The GPU division of AMD was born in 1985 under the name Array Technology Incorporated. From the beginning, the company specialized in graphics chips for PC graphics cards, a market that emerged at the time with the aim of providing cheaper alternatives to IBM's standards, or of offering standards of its own.
ATI Rev 3 graphics solution
In the early days of the PC, you could use either a television-derived monitor, which lacked the receiver needed to display TV broadcasts, or a higher-resolution monochrome screen. In offices around the world, the latter was more widely used because it allowed working with 80 characters per line of text, versus the 40 characters of the television-derived monitors.
But an alternative to the IBM options appeared, called Hercules, which also allowed up to 80 columns, with characters 9 pixels wide. It also offered a monochrome graphics mode with a 720 × 348 pixel framebuffer, but it required a dedicated graphics card, which meant a loss of compatibility with the IBM MDA card.
The ATI Graphics Solution Rev 3 became very popular when it hit the market in 1985, integrating a CGA graphics card and a Hercules card into a single unit. Thanks to that versatility, it avoided having to use different graphics cards in a PC depending on the type of monitor.
ATI Wonder graphics cards
ATI began to make a name for itself with the EGA Wonder, an EGA-standard graphics card that combined the Graphics Solution Rev 3 chipset with an EGA chip from Chips & Technologies, so that a single card could handle various graphics standards without needing multiple adapters. The result was enormous compatibility amid the enormous confusion of competing PC graphics standards.
The card was succeeded by the ATI VGA Wonder, which included a VGA chipset developed by ATI itself while maintaining compatibility with the earlier standards, again on a single card.
ATI Mach graphics cards
In 1990, ATI was still making clones of IBM standards: with the ATI Mach 8 it cloned the IBM 8514/A graphics chip, the precursor of the XGA standard. That standard allowed a screen resolution of 1024 × 768 pixels with a palette of 256 colors, or 640 × 480 with 16 bits per pixel (65,536 colors), and it tried to copy some of the graphics functionality of the Commodore Amiga's graphics chips.
One of the advantages of the Amiga over the PC was the so-called blitter, a hardware unit that copied data from one part of memory to another, manipulating the data on the fly through Boolean operations. This allowed the Amiga, with a less powerful processor, to run powerful drawing applications.
The peculiarity of this standard is that it was born as a parallel extension of VGA for professional monitors. Its biggest advantage was the addition of blitter-style graphical operations like those found in the Commodore Amiga: visual operations such as drawing lines, copying blocks, or filling a shape with color became possible without spending CPU cycles on them.
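To make the idea concrete, here is a minimal software sketch of what a blitter does in hardware (an illustrative model, not actual Amiga or 8514/A code): copy a rectangular block between framebuffers, combining source and destination pixels with a Boolean operation on the fly.

```python
# Illustrative sketch of a blitter: a rectangular memory-to-memory copy
# that combines source and destination pixels with a Boolean operation.

def blit(src, dst, sx, sy, dx, dy, w, h, op=lambda s, d: s):
    """Copy a w x h block from src to dst, applying `op` per pixel.

    src/dst are 2D lists of integer pixel values (row-major);
    `op` is a bitwise combine such as OR, AND, or XOR.
    """
    for row in range(h):
        for col in range(w):
            s = src[sy + row][sx + col]
            d = dst[dy + row][dx + col]
            dst[dy + row][dx + col] = op(s, d)

# Example: XOR-blit a 2x2 sprite onto a background. XOR-blitting the same
# sprite twice restores the background, a classic trick for moving sprites.
bg = [[0b1111] * 4 for _ in range(4)]
sprite = [[0b0101, 0b0011], [0b1100, 0b1010]]
blit(sprite, bg, 0, 0, 1, 1, 2, 2, op=lambda s, d: s ^ d)
```

Done in hardware, this frees the CPU entirely from per-pixel work, which is why a blitter made such a difference for drawing applications.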
Once the pre-VGA graphics standards were completely obsolete, ATI released the ATI Mach 32, which included a VGA core, thus unifying the ATI Wonder and ATI Mach series into one. ATI nevertheless kept the Wonder name for some products outside its main line of graphics cards, such as the video decoders used at the time to decode Video CD content.
Starting with the ATI Mach 64, the VGA chipset and the Mach accelerator were unified into a single graphics processor.
ATI RAGE graphics cards
The huge success of 3Dfx's Voodoo Graphics and the emergence of games based on real-time 3D graphics pushed companies like ATI to catch up or disappear from the graphics card market. Their answer was the ATI Rage.
The problem with most graphics cards was that processors, once the Intel Pentium became standard in homes, could process 3D game scenes much faster than the graphics cards of the day, which had been designed to work alongside the 486 and had therefore fallen behind.
ATI's solution was to modify its ATI Mach 64 chipset with a series of changes that would make its line of graphics cards competitive in a future of real-time 3D gaming:
- Texture processing and filtering were added, backed by a 4 KB texture cache.
- A triangle setup and rasterization unit was added, relieving the processor of that heavy load.
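The first of those changes, texture filtering, can be sketched in a few lines. This is an illustrative model, not ATI's hardware implementation: bilinear filtering samples a texture at fractional coordinates by blending the four nearest texels, exactly the kind of per-pixel arithmetic a texture unit performs so the CPU does not have to.

```python
# Illustrative sketch of bilinear texture filtering: blend the four
# texels nearest to a fractional (u, v) coordinate.

def bilinear_sample(tex, u, v):
    """tex: 2D list of grayscale texel values; (u, v): fractional coords."""
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(tex[0]) - 1)
    y1 = min(y0 + 1, len(tex) - 1)
    fx, fy = u - x0, v - y0
    # Interpolate horizontally along the top and bottom texel pairs,
    # then vertically between the two results.
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0, 100], [100, 200]]
print(bilinear_sample(tex, 0.5, 0.5))  # prints 100.0, the blend of all four
```

The small 4 KB cache mattered because filtering reads four texels per output pixel, and neighboring pixels reuse most of them.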
However, the first ATI Rage arrived late, and while it matched the power of a Voodoo Graphics, it paled against the Voodoo 2 and the Riva TNT. ATI's response was to increase the number of texture units from 1 to 2 and adopt a 128-bit bus, which it sold as the ATI Rage 128.
Another card that stood out within the ATI Rage line was the Rage Fury MAXX, the first ATI graphics card to implement what would eventually become CrossFire technology, allowing ATI to place two Rage 128 chips on a single card.
The first ATI Radeon graphics cards
ATI ditched the Rage brand starting with the first Radeon, the company's first DirectX 7-capable card. It used the ATI R100 GPU, originally named Rage 7, which in reality was little more than a Rage 128 with a built-in T&L unit: full DirectX 7 support meant implementing units that calculate scene geometry when rendering 3D scenes in real time.
Thus, the Radeon 7500 was at heart nothing more than a Rage 128 with fixed-function units for geometry calculation. It did not match the performance of the first and second generation GeForce cards released at the same time, but it marked the turning point for the company to make the necessary change.
But what left the way clear for ATI to become NVIDIA's rival for years was the disappearance and subsequent purchase of 3Dfx by NVIDIA, and the collapse of S3 and Matrox with graphics cards that were not up to the task.
The purchase of ArtX by ATI
The ATI Technologies of the 2000s owed a lot to the purchase of a small company of former Silicon Graphics engineers called ArtX. When ATI bought it, ArtX's most recent work was the graphics chip for the Nintendo GameCube console, although its engineers had previously worked on the Nintendo 64, and some of them on the legendary Silicon Graphics RealityEngine.
The purchase was important because its human capital and know-how completely transformed ATI, making it possible to stand up to NVIDIA. Paradoxically, although ATI Technologies itself never designed a GPU for Nintendo consoles, the ArtX purchase meant its logo appeared on the Nintendo consoles with ArtX technology, the GameCube and the Wii.
Radeon 9700, the beginning of the golden age
The ATI Radeon 9700, based on the R300 chip, became one of the most important graphics cards in ATI's history, if not the most important of all, equivalent in impact to the first NVIDIA GeForce.
To become competitive, ATI had bought the startup ArtX, founded by the former Silicon Graphics engineers who had worked on the 3D technology of the Nintendo 64 and Nintendo GameCube consoles. The result of that purchase was the R300 GPU, which put ATI at the forefront.
They also had the perfect storm: the launch of the Radeon 9700 coincided with NVIDIA's biggest stumble in its entire history, the GeForce FX. The ATI Radeon legend had begun, and the rivalry between NVIDIA and ATI dates from that point in history.
Xenos, ATI’s first console GPU
Originally, Microsoft was going to release what would become Windows Vista in 2003 or 2004 under the code name Longhorn, and with it a new version of DirectX. This was not the DirectX 10 that eventually shipped for PC but a more limited version that never appeared, and for it ATI Technologies designed a GPU that was the first to use the same units to run the different types of shader.
When Windows Longhorn, the code name for Vista at the time, was not released, ATI and Microsoft decided to repurpose the design as the Xenos GPU of the Xbox 360. It was the first ATI design in a video game console, and it opened a collaboration between Microsoft and ATI, now part of AMD, that continues to this day with the Xbox Series consoles; a collaboration with SONY for the PlayStation would follow. This is why it is one of the most important graphics chips in ATI's history.
ATI Radeon HD 2000
The first ATI graphics card with unified shaders on PC appeared later than the GeForce 8800, even though ATI had been the first in history to implement them, with the Xbox 360 GPU.
However, the architecture was completely different, and the Microsoft console GPU cannot be compared with the R600. In its most powerful version, the Radeon HD 2900, the R600 had 320 ALUs, which under the TeraScale architecture means 64 VLIW5 units, organized into 4 SIMD arrays of 16 units each.
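The ALU count follows directly from the VLIW5 organization; a quick sanity check of the arithmetic (the grouping into 4 SIMD arrays is the R600 layout described above):

```python
# Sanity-check the R600 ALU arithmetic: each VLIW5 unit bundles 5 scalar
# ALUs, and the 64 VLIW5 units are grouped into 4 SIMD arrays.
LANES_PER_VLIW5 = 5
VLIW5_UNITS = 64
SIMD_ARRAYS = 4

total_alus = VLIW5_UNITS * LANES_PER_VLIW5          # 64 * 5 = 320
units_per_simd = VLIW5_UNITS // SIMD_ARRAYS         # 64 / 4 = 16
print(total_alus, units_per_simd)  # prints: 320 16
```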
In any case, the R600 GPUs on which the Radeon HD 2000 series was based were a disappointment, unable to compete with the NVIDIA GeForce 8800 GTX. A year later, ATI released the HD 3000 series on the 55 nm node, polishing some design flaws of the R600 architecture, especially its internal ring bus.
ATI Radeon HD 4870
Following the fiasco of the R600 architecture and its variants, AMD decided to upgrade it from 320 to 800 ALUs, creating the R700 architecture that debuted with the ATI Radeon HD 4000 series. The ATI Radeon HD 4870 became the absolute queen: ATI had regained the performance throne and was back in direct competition.
ATI Radeon HD 5000
Instead of designing a completely new architecture for DirectX 11, ATI decided to release the R800 chipset, which was an R700 adapted and improved for DirectX 11. In truth, it was not deeply optimized for DirectX 11: ATI shipped TeraScale 2 as a stopgap, with very little real change.
One of the new features in DirectX 11 was Compute Shaders, although initially they were implemented by unifying the graphics and compute command lists into one, which worked out badly for both ATI and NVIDIA. In any case, ATI at the time was focused on realizing the idea of AMD, of which it was already a part, of uniting CPU and GPU on a single chip, so its efforts were concentrated on the AMD Fusion project.
The 5000 series marked the end of the ATI brand on graphics cards; the division was later renamed the Radeon Technologies Group, or RTG, which continues to this day.