The first multimedia PCs needed a separate card to be able to decode video; some may even remember how the first PC DVD players were sold bundled with decoder cards.
Little by little, thanks to the blessings of Moore's Law, these decoders shrank until they were integrated into the graphics processors themselves, saving us from having to buy additional hardware to watch films and series on our PCs, which today we can do anywhere.
What is a digital signal processor?
Digital signal processors, better known by the acronym DSP, take an input signal or data stream, apply an algorithm to it, and generate an output signal or data stream. They can be used to decode multimedia file formats: the incoming data stream may belong to one format or another, and the DSP's program processes it accordingly.
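As an illustration only (this is not how a hardware DSP is actually programmed), the input-to-output idea can be sketched in software with a simple FIR filter; the function name and coefficients here are made up for the example, but the multiply-accumulate loop at its core is the classic DSP workload:

```python
def fir_filter(samples, coeffs):
    """Apply a FIR filter: each output sample is a weighted sum of
    the most recent input samples (the multiply-accumulate operation
    that DSP hardware is built to execute quickly)."""
    out = []
    for i in range(len(samples)):
        acc = 0.0
        for j, c in enumerate(coeffs):
            if i - j >= 0:
                acc += c * samples[i - j]
        out.append(acc)
    return out

# A 4-tap moving average smooths a noisy step signal.
signal = [0, 0, 0, 8, 8, 8, 8]
smoothed = fir_filter(signal, [0.25, 0.25, 0.25, 0.25])
```

Swapping in a different coefficient table changes the algorithm without changing the hardware, which is exactly what distinguishes a DSP from fixed-function logic.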
They should not be confused with fixed-function units: a DSP executes a program, and that program can be changed. Normally, however, the program is not accessible at user level; only the manufacturer of the DSP has access to its program memory, which is usually updated through firmware updates issued by the manufacturer itself.
Simply put, DSPs are one more type of processor, alongside CPUs, GPUs, etc. In recent years they have been increasingly integrated into other processors to speed up certain multimedia tasks, including real-time decoding of certain media formats.
How do hardware codecs work on our PCs?
A decoder is nothing more than a DSP running a program that converts the data blocks of a media file into the succession of images and sounds we see on our screen. But wouldn't it be simpler to store the images as they are? The answer is that it would be extremely inefficient: working with uncompressed data would require an enormous amount of storage space and bandwidth.
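A quick back-of-the-envelope calculation shows why; the resolution and frame rate below are just example figures for a 1080p stream:

```python
def uncompressed_rate(width, height, bytes_per_pixel, fps):
    """Bytes per second needed to move raw, uncompressed video frames."""
    return width * height * bytes_per_pixel * fps

# 1080p, 24-bit color (3 bytes per pixel), 24 frames per second.
rate = uncompressed_rate(1920, 1080, 3, 24)   # bytes per second
two_hour_movie = rate * 2 * 3600              # bytes for a full film

print(rate / 1e6, "MB/s")           # roughly 149 MB/s sustained
print(two_hour_movie / 1e12, "TB")  # over a terabyte per movie
```

Numbers like these make it obvious that no optical disc or consumer internet connection could carry raw video, hence the need for compressed formats.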
That’s why media files are compressed into different formats. It is not really different from compressing or decompressing any other file; the principle is the same. Compression can be based on techniques such as assigning short codes to common elements, storing only the variation in color from one image to the next, and so on.
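A toy sketch of the second idea, storing only what changes between frames and then collapsing the repeated values; the helper names are hypothetical and real codecs are vastly more sophisticated, but the principle is the same:

```python
def frame_delta(prev, curr):
    """Store only the per-pixel difference from the previous frame."""
    return [c - p for p, c in zip(prev, curr)]

def run_length_encode(values):
    """Collapse runs of identical values into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

# Two nearly identical "frames": only one pixel changed, so the
# delta is mostly zeros and compresses extremely well.
frame1 = [10, 10, 10, 10, 10, 10]
frame2 = [10, 10, 12, 10, 10, 10]
delta = frame_delta(frame1, frame2)   # [0, 0, 2, 0, 0, 0]
compact = run_length_encode(delta)    # [(0, 2), (2, 1), (0, 3)]
```

Because consecutive video frames are usually almost identical, encoding differences instead of full frames is where most of the savings come from.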
The hardest part is encoding: converting a raw image, video, or audio stream into one of these formats requires far more computing power than decoding it, which is why new formats often get hardware decoders well before they get hardware encoders.
Why are new video codecs being developed?
The reason is that new forms of consumption keep emerging. For example, the video codec for DVD-Video was MPEG-2 (also known as H.262), which was good enough to reproduce video on a conventional tube TV. But with the jump to Blu-ray, it became clear that this format was not the best fit for the transfer speed of the new storage medium of the time, so the creation of H.264 became necessary.
Currently, the era of optical formats has passed into history, and content providers need to transmit over the network. Although a fiber-optic connection is faster than a Blu-ray drive, for content providers it is far better to fit as much content as possible into a given bandwidth, since this means savings in their infrastructure, both in servers and in communication.
The downside at the user level? Newer video and audio codecs compress data further and end up requiring more processing power, since the number of steps needed to reconstruct the original data grows considerably, which in turn demands more powerful DSPs.
Where are the hardware video codecs located?
In a PC they normally sit inside the GPU as one of its accelerators, connected to the GPU's own private northbridge. In the case of SoCs, since the CPU and GPU share the same northbridge, the codecs hang off that shared northbridge instead.
In the case of CPUs, it is unusual to find specialized hardware for video encoding and decoding, although there is nothing preventing its integration as a coprocessor. It is uncommon because the codec needs access to video memory: the display controller is what reads the image decoded by the video codec as a frame buffer.
In SoCs, codecs have a direct relationship with other DSPs and/or accelerators, such as the ISP responsible for digitizing the images captured by the camera, and even with specialized AI neural processors. They work side by side, whether to encode what we capture with the camera into video formats, or to upscale the resolution and reduce the noise in the image and sound of our videos.