Back in the late 70s and early 80s, a revolution took place: The Home Computer appeared. The commercial and scientific computers that existed before were clumsy and cumbersome in contrast: They filled huge cabinets, if not entire rooms, their power requirements were humongous, they produced a lot of waste heat, and operating them required skilled staff.
Offering only limited capabilities, and often tailored to a specific task, these machines had little to no appeal to ordinary people, a.k.a. consumers. But all of this changed with the introduction of the Home Computer.
Thanks to advances in microelectronics (most notably the single-chip CPU, pioneered by Intel), Home Computers were small, had very modest power consumption, lacked much of the overwhelming intricacy of previous computers, and were much more affordable. Their comparative simplicity was impressive: Machines such as the legendary Commodore C64 connected to any existing TV set, were ready for operation at the flick of their power switch, and could be tinkered with by the interested hobbyist.
Computer technology thus became accessible to the general public.
As time and technology progressed, the Home Computer was finally superseded by the Personal Computer (IBM compatible). Again, complexity increased. By the end of the 90s, it was not all that uncommon for a Personal Computer to have multiple processor chips, several disks and one or more graphics cards. Nowadays, a PC may even have eight processors all crammed into a single package. A typical motherboard may have in excess of 20 electrical layers providing tens of thousands of electrical connections, and most of the logic functions outside the CPU are bundled up into a highly obscure and monolithic chipset. The most recent trend is the integration of the graphics controller into the CPU itself. Unfortunately, this leaves little for the hobbyist to tinker with and learn from.
By the turn of the millennium, integration continued to scale upwards, and the same process that had miniaturized the Central Processing Unit into a single chip about three decades earlier was now applied to the computer system as a whole: The System-on-a-Chip was born. This technique condenses the components of an entire computer system into a single chip, and it played a key role in enabling the development of the now ubiquitous smartphones and tablets. SoCs are present in an ever increasing number of computer peripherals and appliances such as routers, firewalls, NAS boxes and media players.
In 2012, the Raspberry Pi Foundation introduced the Raspberry Pi: a very small (the size of a credit card) and certainly inexpensive (around USD 25) Linux computer based on a SoC designed by Broadcom for multimedia applications.
Due to the high level of integration the SoC offers, the mainboard itself becomes rather simple. There are no data or address buses anymore (at least not outside the SoC). In fact, the only real bus left on the board is the well-known Universal Serial Bus. Besides the SoC, there is only one other chip: a USB hub with built-in Ethernet functionality.
The USB-at-heart philosophy allows a myriad of existing peripherals to be used with the Raspberry Pi, limited only by the availability of drivers. Thanks to its mature and well-proven Linux OS, drivers for countless devices are readily available for download at no cost. The Foundation's Raspbian Linux also gives the Raspberry Pi access to a huge pre-existing library of application and system software, along with a large and active community of fellow users and developers.
In June 2014, the Raspberry Pi Foundation presented the Raspberry Pi Compute Module to the world. This module packs the Raspberry Pi SoC and a 4 GB flash memory chip onto a single, SODIMM-sized board, giving hardware designers the capability to easily integrate and expand on the Raspberry Pi concept.
The Raspberry Pi, it seems, is heralding a renaissance of the Home Computer.