Virtual reality

Virtual reality (VR) is a technology that allows a user to interact with a computer-simulated environment, be it real or imagined. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical and gaming applications. Users can interact with a virtual environment or a virtual artifact (VA) either through standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove, the Polhemus boom arm, or an omnidirectional treadmill. The simulated environment can be similar to the real world, as in simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. In practice, it is currently very difficult to create a high-fidelity virtual reality experience, due largely to technical limitations on processing power, image resolution, and communication bandwidth. However, those limitations are expected to be overcome as processing, imaging, and data communication technologies become more powerful and cost-effective over time.
It is unclear exactly where the future of virtual reality is heading. In the short run, the graphics displayed in head-mounted displays (HMDs) will soon reach a point of near-realism. Audio capabilities will move into a new realm of three-dimensional sound, meaning the addition of sound channels both above and below the listener. The virtual reality application of this future technology will most likely come in the form of over-ear headphones.

Within existing technological limits, sight and sound are the two senses that best lend themselves to high-quality simulation. However, attempts are currently being made to simulate smell. Current research is linked to a project aimed at treating post-traumatic stress disorder (PTSD) in veterans by exposing them to combat simulations, complete with smells. Although VR is often portrayed in popular culture as entertainment, this illustrates the point that its future is very much tied to therapeutic, training, and engineering demands. Given that, full sensory immersion beyond basic tactile feedback, sight, sound, and smell is unlikely to be a goal in the industry. It is worth mentioning that simulating smells, while it can be done very realistically, requires costly R&D for each odor, and the machine itself is expensive and specialized, using capsules tailor-made for it. Thus far, basic and very strong smells such as burning rubber, cordite, and gasoline fumes have been made; something complex such as a food product or a specific flower would be prohibitively expensive.

Engaging the remaining sense, taste, would require manipulating the brain directly. This would move virtual reality into the realm of simulated reality, like the "head plugs" used in The Matrix. Although no form of this has been seriously developed at this point, Sony has taken a first step. On April 7, 2005, Sony went public with the information that it had filed for and received a patent for the idea of non-invasively beaming different frequencies and patterns of ultrasonic waves directly into the brain to recreate all five senses. There has been research suggesting that this is possible, but Sony has not conducted any tests and says it is still only an idea.

It has long been feared that virtual reality will be humanity's last invention: once simulations become cheaper and more widespread, no one will ever want to leave their "perfect" fantasies.

LCD projector

LCD (liquid crystal display) projectors usually contain three separate LCD panels, one each for the red, green, and blue components of the video signal, although single-panel LCD projectors have been produced in the past. Light from a metal-halide lamp, which outputs an ideal color temperature and a broad spectrum of color, is split by a prism into the three component colors. These lamps can produce an extremely large amount of light within a small area: current projectors average 2,000 to 4,000 ANSI lumens. As light passes through the LCD panels, individual pixels can be opened to allow light to pass or closed to block it, as if each little pixel were fitted with a Venetian blind. This activity modulates the light and, by allowing many different shades from each color panel, produces the image that is projected onto the screen.
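As a rough illustration of this per-pixel modulation, consider the following sketch. The linear light-mixing model and all numbers are simplifying assumptions for illustration only, not a model of any real projector:

```python
# Minimal sketch of three-panel LCD light modulation (illustrative only).

def project_pixel(rgb_open_fractions, lamp_intensity=1.0):
    """Model one projected pixel.

    Each LCD panel passes a fraction of its color component, like a
    Venetian blind that is partly open; the three modulated beams are
    then recombined into the projected pixel.
    """
    r, g, b = rgb_open_fractions  # 0.0 = fully blocked, 1.0 = fully open
    return tuple(lamp_intensity * f for f in (r, g, b))

# A pixel meant to appear orange: red wide open, green half open, blue shut.
print(project_pixel((1.0, 0.5, 0.0)))  # -> (1.0, 0.5, 0.0)
```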

With a lens that "projects" the image on any flat surface and does not require large "furniture" (as a big TV would), LCD projectors tend to be smaller and much more portable than older systems. The best image quality is achieved with a blank white or grey surface, and for this reason dedicated projection screens are often used.

Perceived color in a projected image is a factor of both projection surface and projector quality. Since white is a fairly neutral color, white surfaces are best suited to viewers who want natural color tones; as such, white projection surfaces are more common in business and school presentation environments. However, the darkest black the image can show is only as dark as the surface being projected on. Because of this, some presenters and presentation-space planners prefer grey screens, which create a perception of higher contrast by placing the image on a darker background. The trade-off of this perceived higher contrast is that color tones will be off (purple-tinged lips, for example); this can be adjusted through the projector's color and hue settings, but never corrected completely.
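A toy model can make the grey-screen trade-off concrete. All figures below are invented for illustration, and the model reduces a real screen's behavior to just two coefficients (projector-direction gain and ambient-light reflectance):

```python
# Toy model: on-screen black is floored by reflected room light, so a screen
# that rejects more ambient light can raise perceived contrast even if it is
# darker overall. Numbers are made up for illustration.

def on_screen_contrast(proj_white, proj_black, gain, ambient, ambient_reflect):
    white = gain * proj_white + ambient_reflect * ambient
    black = gain * proj_black + ambient_reflect * ambient
    return white / black

# White matte screen: reflects projector light and room light equally.
print(on_screen_contrast(1000, 1, gain=1.0, ambient=5, ambient_reflect=1.0))  # ~168:1
# Grey screen: lower gain, but rejects proportionally more room light.
print(on_screen_contrast(1000, 1, gain=0.8, ambient=5, ambient_reflect=0.4))  # ~286:1
```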
Early LCD systems were often intended to be used with existing overhead projectors, built as a large "plate" that was placed on the projector in place of the transparencies. This provided the market with a stop-gap solution in the era before the computer became the universal display medium, creating a market for LCD projectors before their current main use became popular.
LCD projection is also used in large rear-projection television sets, where it allows better image quality than a single large panel could provide; a common rule of thumb is that a direct-view LCD's image quality decreases as its size increases. (The closest direct-view equivalent, LG's 100-inch LCD TV, is still in the prototype stage, though it is a huge advancement towards projector-sized televisions.) The workaround is to use a small LCD panel (or panels) and project the image through a lens onto a rear-projection screen, giving a larger screen size (with a decreased contrast ratio) but without the quality loss.

In 2004 and 2005, LCD front projection enjoyed a comeback thanks to the addition of the dynamic iris, which has improved perceived contrast up to the levels of DLP.

The basic design of an LCD projector is frequently used by hobbyists who build their own DIY projection systems. The basic technique is to combine a high-CRI HID lamp and ballast with condenser and collector Fresnel lenses, an LCD panel removed from a common computer display, and a triplet lens.

Blu-ray Disc

A Blu-ray Disc (also called BD) is a high-density optical disc format for the storage of digital media, including high-definition video. The name Blu-ray Disc is derived from the blue-violet laser used to read and write this type of disc. Because of its shorter wavelength (405 nm), substantially more data can be stored on a Blu-ray Disc than on a DVD, which uses a red 650 nm laser: a Blu-ray Disc can store 25 GB on each layer, as opposed to a DVD's 4.7 GB.

Blu-ray Disc is similar to PDD, another optical disc format developed by Sony (available since 2004), but offers higher data transfer speeds. PDD was not intended for home video use and was aimed at business data archiving and backup. Blu-ray Disc is currently in a "format war" with the rival HD DVD format.

The Blu-ray Disc system uses a blue-violet laser operating at a wavelength of 405 nm, similar to the one used for HD DVD, to read and write data. Conventional DVDs and CDs use red and infrared lasers at 650 nm and 780 nm respectively.
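The capacity gain follows, to first order, from diffraction: the minimum laser spot size scales with wavelength divided by the lens's numerical aperture (NA), so areal data density scales roughly with (NA / wavelength) squared. The sketch below is a back-of-the-envelope check; the NA values of 0.60 for DVD and 0.85 for Blu-ray are the published specifications, not figures from the text above:

```python
# Rough diffraction-based estimate of the Blu-ray vs. DVD density gain.

dvd_wavelength, dvd_na = 650e-9, 0.60  # red laser
bd_wavelength, bd_na = 405e-9, 0.85    # blue-violet laser

# Areal density scales roughly with (NA / wavelength) ** 2.
density_ratio = (bd_na / bd_wavelength) ** 2 / (dvd_na / dvd_wavelength) ** 2
print(f"predicted density gain: {density_ratio:.1f}x")  # ~5.2x
print(f"actual capacity ratio:  {25 / 4.7:.1f}x")       # ~5.3x
```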
Because the Blu-ray Disc standard places the data recording layer close to the surface of the disc, early discs were susceptible to contamination and scratches and had to be enclosed in plastic caddies for protection. The consortium worried that such an inconvenience would hurt Blu-ray Disc's market adoption. Blu-ray Discs now use a layer of protective material on the surface through which the data is read.

The recent introduction of a clear polymer coating has given Blu-ray Discs substantial scratch resistance. The coating, developed by TDK, is called "Durabis". It allows BDs to be cleaned safely with only a tissue and, according to Samsung Optical technical manager Chas Kalsi, successfully resists "wire wool scrubbing". It is not clear, however, whether the Durabis coating will be standard on all discs or used only on premium ones.

Both Sony and Panasonic replication methods include proprietary hard-coat technologies. Sony's rewritable media are sprayed with a scratch-resistant and antistatic coating. Verbatim's recordable and rewritable Blu-ray Discs use its own proprietary hard-coat technology, called ScratchGuard.
Although the Blu-ray Disc specification has been finalized, engineers continue working to advance the technology. Quad-layer (100 GB) discs have been demonstrated on a drive with modified optics. Furthermore, TDK announced in August 2006 that it had created a working experimental Blu-ray Disc capable of holding 200 GB of data on a single side, using six 33 GB data layers. Such discs would almost certainly not work on today's Blu-ray Disc players, as these devices are designed and tested only against discs that meet the current specification.

High Definition Television

High-definition television (HDTV) is a digital television broadcasting system with a significantly higher resolution than traditional formats (NTSC, SECAM, PAL). While some early analog HDTV formats were broadcast in Europe and Japan, HDTV is usually broadcast digitally, because digital television (DTV) broadcasting requires much less bandwidth. HDTV technology was first introduced in the US during the 1990s by a group of electronics companies called the Digital HDTV Grand Alliance.

In the early 2000s, a number of high-definition television standards were competing for the still-developing niche market. Current HDTV standards are defined by the International Telecommunication Union (ITU-R BT.709) as 1080 active interlaced or progressive scan lines, or 720 progressive scan lines, using a 16:9 aspect ratio. HDTV is also capable of "theater-quality" audio because it uses the Dolby Digital (AC-3) format to support 5.1 surround sound. It should be noted that while HDTV is closer to theater quality than conventional television, the 35 mm and 70 mm film projectors used in theaters still offer the highest resolution and best viewing quality on very large screens. Many HDTV programs are produced from movies on film as well as from content shot in HD video.
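To put the resolution figures in perspective, the sketch below compares total pixel counts. The frame widths follow from the 16:9 aspect ratio; the 704x480 active-pixel figure used for NTSC-era standard definition is an approximation, since analog NTSC has no true horizontal pixel count:

```python
# Pixel-count comparison of the BT.709 HDTV formats against approximate SD.

formats = {
    "SD (NTSC, approx.)": (704, 480),
    "720p": (1280, 720),     # 720 lines * 16/9 = 1280 pixels wide
    "1080i/1080p": (1920, 1080),  # 1080 lines * 16/9 = 1920 pixels wide
}

sd_pixels = 704 * 480
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / sd_pixels:.1f}x SD)")
```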

The term "high-definition" can refer to the resolution specifications themselves or, more loosely, to media capable of similar sharpness, such as photographic film. As of 2007, 24 million US households have HDTV sets. However, only half are set up to actually receive HDTV programming: some consumers are unaware that they must use special receivers to get HDTV from cable, or HDTV tuners to receive over-the-air broadcasts, while others plan to set this up in the future.

Optical computer

An optical computer is a computer that uses photons, rather than electrons, to manipulate, store, and transmit data. Photons have fundamentally different physical properties from electrons, and researchers have attempted to exploit these properties to produce computers with performance and/or capabilities greater than those of electronic computers. Optical computer technology is still in its early stages: functional optical computers have been built in the laboratory, but none has progressed past the prototype stage.

Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. Other research projects take a non-traditional approach, attempting to develop entirely new methods of computing that are not physically possible with electronics.

The fundamental building block of modern electronic computers is the transistor. To replace electronic components with optical ones, an equivalent "optical transistor" is required. This is achieved using materials with a non-linear refractive index; in particular, materials exist in which the intensity of incoming light affects the intensity of the light transmitted through the material, in a manner similar to the voltage response of an electronic transistor. This "optical transistor" effect is used to create logic gates, which in turn are assembled into the higher-level components of the computer's CPU.
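As a loose illustration of the idea, the sketch below models an intensity-dependent transmission as a simple threshold. This is a made-up idealization, not the response curve of any real nonlinear material:

```python
# Sketch: a thresholded "optical transistor" used to build an AND gate.

def optical_transistor(input_intensity, threshold=1.5):
    """Transmit light only when the combined input intensity exceeds a threshold."""
    return 1.0 if input_intensity > threshold else 0.0

def and_gate(a, b):
    # Two beams (each 0 or 1) are combined on the nonlinear material;
    # only when both are present does the sum exceed the threshold.
    return optical_transistor(a + b)

for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        print(f"AND({a:.0f}, {b:.0f}) = {and_gate(a, b):.0f}")
```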

Another claimed advantage of optics is reduced power consumption, but an optical communication system will typically use more power than an electronic one over short distances. This is because the shot noise of an optical communication channel is greater than the thermal noise of an electrical channel, which, from information theory, means that more signal power is required to achieve the same data capacity. However, over longer distances and at greater data rates, the loss in electrical lines is sufficiently large that optical communications use comparatively less power. As communication data rates rise, this crossover distance becomes shorter, and so the prospect of using optics in computing systems becomes more practical.
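The information-theoretic point can be made concrete with the Shannon capacity formula C = B log2(1 + S/N): for a fixed bandwidth, a noisier channel needs more signal power to reach the same capacity. The numbers in the sketch below are invented purely for illustration:

```python
# Toy illustration: invert the Shannon capacity formula to find the signal
# power needed on channels with different noise floors. All values invented.

def required_signal_power(capacity_bps, bandwidth_hz, noise_power_w):
    """Solve C = B * log2(1 + S/N) for the signal power S."""
    return noise_power_w * (2 ** (capacity_bps / bandwidth_hz) - 1)

capacity, bandwidth = 10e9, 5e9  # 10 Gbit/s over 5 GHz
for label, noise in [("thermal-noise channel", 1e-9),
                     ("shot-noise channel", 4e-9)]:
    s = required_signal_power(capacity, bandwidth, noise)
    print(f"{label}: needs {s * 1e9:.1f} nW of signal power")
```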

A significant challenge for optical computing is that computation is a nonlinear process in which multiple signals must interact to compute the answer. Light, being an electromagnetic wave, can interact with another electromagnetic wave only in the presence of electrons in a material, and the strength of this interaction is much weaker for light waves than for the electronic signals in a conventional computer. As a result, the processing elements of an optical computer require higher powers and larger dimensions than the transistors of a conventional electronic computer.