7 Best All In One Personal Computers | March 2017
- 27-inch 4K display
- RealSense 3D camera
- Red and black colors look childish

Model: ideacentre AIO Y910 - F

- Comes with Windows 10 Pro
- Includes a responsive stylus
- NVIDIA GeForce 965M graphics
- Intel Core i7-6700T processor
- Over 1TB of storage
- ENERGY STAR certified
- 6-speaker audio system
- 2x HDMI outputs
- 512GB SSD and a 2TB hard drive
Evolution of the Personal Computer
The first computer was designed in 1939 by American physicist John Atanasoff, though it would be another thirty-two years before the personal computer made its debut in 1971. The invention of integrated circuits brought the computer to new heights. Large numbers of miniature transistors were packed onto thin silicon chips to create integrated circuits, which made computers faster, smaller, more powerful, and less expensive: truly a revolution for the computer age.
These powerful computers would enable scientists to create the next generation of personal computers. It was not long before the microprocessor was developed, which effectively brought the computer to the common man's house. The first computers were ready for home assembly by the mid 1970s.
Noticing a gap in service, some companies took it upon themselves to assemble computers for their clients, which boosted popularity and made the computer more user-friendly. The human need to socialize, coupled with the ability of computers to form networks, would drive the internet's rise into mainstream use in the early 1990s.
Since the 1990s, computers have become smaller, more efficient, and much more powerful. Even an entry-level machine today far outpaces computers built just ten years ago. This rapid evolution has helped to shape our entire world around the technologies in use today.
The Rise Of Touch Screen Technology
Touch screen devices may seem like an invention of the modern era, but the first one was actually created in 1965. Since its inception, the touch screen has spread throughout the technology sector thanks to its direct, hands-on ease of use.
The first displays were capacitive touch screens. A capacitive screen layers a conductive coating over an insulator; because the human body conducts electricity, a fingertip touching the surface distorts the screen's electrostatic field, and that change registers as a touch. The first capacitive displays were very basic, registering only one touch at a time and unable to sense how much pressure was being applied.
Resistive touch screens debuted next. Created in the 1970s, resistive touch screens did not rely on the body's electrical properties. The basic design placed a flexible conductive sheet over the display; pressing it against the layer beneath registered the touch. While it may not seem like a major advancement in touch screen technology, it removed the need to use a bare finger to control the screen, meaning that any number of objects could be used to input data. Though revolutionary in computing at first, resistive touch screens are no longer used on personal computers at all. Their use is currently limited to point-of-sale touchpads in places like restaurants and grocery stores.
It wasn't until the 1980s that touch screen technologies advanced toward what we now see in tablets and personal computers. As large companies scrambled to create the next big touch screen technology, a relatively unknown player stepped forward to bring the world multi-touch, which paved the way for the touch screen computers and tablets used today.
Is Holographic Technology Next For Personal Computers?
The notion of a hologram usually brings to mind scenes from virtually any science fiction series. After experiencing 3D theaters, and the 4D films seen at many amusement parks, holographic technology seems to be the only avenue yet to be mastered.
Contrary to popular belief, most of the holograms experienced today are not actual holograms: they are projections using an antiquated illusion known as Pepper's ghost. The illusion is named for scientific demonstrator John Henry Pepper, who popularized it in the 1860s, and it is still in use today. In its most basic form, the illusion involves two rooms: the stage and a hidden room. Viewers sit facing the stage while an angled sheet of glass reflects an image of the brightly lit hidden room, making figures seem to appear on stage in front of the audience.
These reflections can make objects appear out of thin air or turn one object on stage into another, with a simple flash of light masking the exchange. It was as effective in the 19th century as it is today. Though the basic principle remains the same, modern Pepper's ghost illusions are much more involved, often using digital effects and projections to create what the eye perceives as a hologram.
True hologram technology on your desktop is still far off. The development of the laser in the 1960s gave birth to the hologram as it is known today, and since then it has found its way into many aspects of daily life. One example is the use of security holograms to help determine a credit card's authenticity. Lifelike holograms, like the ones seen in sci-fi movies, are created by using laser light to record a detailed visual rendition of an object that can be played back later.
There are significant obstacles to building this technology into displays. The data required to perform such a task would be very difficult to contain in a usable model that could fit on a tabletop or be easily moved. The other fundamental challenge is that the user may view the unit from any number of angles, which would negatively affect how the image is displayed. This does not mean that holographic technology will never exist, though; many companies are researching its use in consumer products.