Many, many years ago, there were only CPUs and no GPUs.
Take the ancient Atari 2600 games console as an example. It did not have a GPU. Instead, the CPU had to make sure the screen was drawn at exactly the right moment.
When the TV was ready to receive the video signal from the games console, the CPU had to stop processing the game and generate the video signal itself, then keep feeding it out for the entire frame's worth of picture information. Only once the signal reached the opposite, bottom corner of the screen, while the beam travelled back up to the top, could the CPU actually do any game-mechanics updates.
This meant that the Atari 2600's CPU could only spend (from memory) about 30% of its power on game processing, with the remaining 70% dedicated entirely to video updates, as the CPU would literally race the electron beam in the TV.
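To make that concrete, here is a very rough sketch of that kind of frame loop, written in C purely for readability (real 2600 games were written in 6502 assembly, and the function names below are made-up placeholders, not any real API):

    #define VISIBLE_SCANLINES 192  /* visible lines in a typical NTSC 2600 frame */

    /* Hypothetical placeholders for the hardware interaction; on the real
       console these would be hand-written register writes to the video chip. */
    static void wait_for_vertical_blank(void) { /* beam returns to the top of the screen */ }
    static void update_game(void)             { /* read joystick, move objects, check collisions */ }
    static void emit_scanline(int line)       { (void)line; /* feed one line of picture to the TV */ }

    int main(void)
    {
        for (int frame = 0; frame < 3; frame++) {   /* one iteration = one TV frame (~60/s on NTSC) */
            /* The short blanking window is the only time the CPU is free
               to think about the game itself... */
            wait_for_vertical_blank();
            update_game();

            /* ...because for the rest of the frame it races the electron beam,
               generating the picture one scanline at a time. */
            for (int line = 0; line < VISIBLE_SCANLINES; line++)
                emit_scanline(line);
        }
        return 0;
    }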
So later on, newer generations of computers and game consoles started including dedicated circuitry to handle video processing. These video chips started out as modest microprocessors in their own right, and eventually evolved into the massively parallel processing behemoths that GPUs are today.
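The reason that evolution headed towards massive parallelism: most graphics work is the same small calculation repeated for every pixel, and no pixel's result depends on any other pixel's. Here is a toy illustration of that idea in C (shade_pixel is a made-up stand-in for whatever per-pixel work a game needs, not a real graphics API):

    #define WIDTH  320
    #define HEIGHT 240

    /* Made-up per-pixel calculation, standing in for lighting, blending, etc. */
    static unsigned char shade_pixel(int x, int y)
    {
        return (unsigned char)((x ^ y) & 0xFF);
    }

    static unsigned char framebuffer[WIDTH * HEIGHT];

    int main(void)
    {
        /* A CPU works through this nested loop one pixel at a time.
           A GPU effectively runs the loop body for thousands of pixels at once,
           which is exactly why dedicated video hardware went massively parallel. */
        for (int y = 0; y < HEIGHT; y++)
            for (int x = 0; x < WIDTH; x++)
                framebuffer[y * WIDTH + x] = shade_pixel(x, y);
        return 0;
    }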