
Virtual reality

Virtual reality (VR) is a technology that allows a user to interact with a computer-simulated environment, whether real or imagined. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical and gaming applications. Users can interact with a virtual environment or a virtual artifact (VA) either through standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove, the Polhemus boom arm, or an omnidirectional treadmill. The simulated environment can be similar to the real world, as in simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. In practice, it is currently very difficult to create a high-fidelity virtual reality experience, due largely to technical limitations on processing power, image resolution, and communication bandwidth. However, those limitations are expected to be overcome eventually as processor, imaging, and data communication technologies become more powerful and cost-effective over time.
It is unclear exactly where the future of virtual reality is heading. In the short run, the graphics displayed in the head-mounted display (HMD) will soon reach a point of near realism. Audio capabilities will move into a new realm of three-dimensional sound, which refers to the addition of sound channels both above and below the listener. The virtual reality application of this future technology will most likely take the form of over-ear headphones.

Within existing technological limits, sight and sound are the two senses that best lend themselves to high-quality simulation. However, attempts are currently being made to simulate smell. Current research is linked to a project aimed at treating post-traumatic stress disorder (PTSD) in veterans by exposing them to combat simulations, complete with smells. Although VR is often seen in popular culture as entertainment, this illustrates that its future is very much tied to therapeutic, training, and engineering demands. Given that, full sensory immersion beyond basic tactile feedback, sight, sound, and smell is unlikely to be a goal in the industry. It is worth mentioning that simulating smells, while it can be done very realistically, requires costly R&D for each odor, and the machine itself is expensive and specialized, using capsules tailor-made for it. Thus far, basic and very strong smells such as burning rubber, cordite, and gasoline fumes have been produced; something complex such as a food product or a specific flower would be prohibitively expensive.

To engage the remaining sense of taste, the brain must be manipulated directly. This would move virtual reality into the realm of simulated reality, like the "head-plugs" used in The Matrix. Although no form of this has been seriously developed at this point, Sony has taken a first step. On April 7, 2005, Sony went public with the information that it had filed for and received a patent for the idea of non-invasively beaming different frequencies and patterns of ultrasonic waves directly into the brain to recreate all five senses. There has been research suggesting that this is possible. Sony has not conducted any tests as of yet and says that it is still only an idea.

It has long been feared that virtual reality will be the last invention of humanity, as once simulations become cheaper and more widespread, no one will ever want to leave their "perfect" fantasies.

LCD projector

LCD (liquid crystal display) projectors usually contain three separate LCD panels, one each for the red, green, and blue components of the video signal, although single-panel LCD projectors have been produced in the past. Light from a metal-halide lamp, which outputs an ideal color temperature and a broad spectrum of color, is split by a prism into the three component colors. These lamps can produce an extremely large amount of light within a small area: current projectors average 2,000 to 4,000 ANSI lumens. As light passes through the LCD panels, individual pixels can be opened to allow light to pass or closed to block it, as if each pixel were fitted with a Venetian blind. This activity modulates the light and, by allowing many different shades from each color panel, produces the image that is projected onto the screen.
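
As a rough illustration of this modulation, the sketch below (Python, a toy model with made-up numbers, not any real projector's firmware) treats each LCD panel as an array of per-pixel transmittance values between 0 (closed) and 1 (open) and combines the three modulated color components into the projected image.

    # Toy model of three-panel LCD light modulation (illustrative only).
    import numpy as np

    lamp_intensity = 4000.0  # hypothetical lamp output, arbitrary units

    # Per-pixel transmittance per panel: 0.0 = blind closed, 1.0 = fully open.
    # A real panel is e.g. 1024x768; a 2x2 image keeps the example readable.
    red_panel   = np.array([[1.0, 0.0], [0.5, 1.0]])
    green_panel = np.array([[1.0, 0.0], [0.5, 0.0]])
    blue_panel  = np.array([[0.0, 1.0], [0.5, 0.0]])

    # The prism splits the lamp output into three equal components;
    # each panel modulates its component pixel by pixel.
    component = lamp_intensity / 3.0
    image = np.stack([red_panel, green_panel, blue_panel], axis=-1) * component

    print(image[0, 0])  # top-left pixel: full red + green, no blue -> yellow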

With a lens that projects the image onto any flat surface and no need for large furniture (as a big TV would require), LCD projectors tend to be smaller and much more portable than older systems. The best image quality is achieved with a blank white or grey surface, and for this reason dedicated projection screens are often used.

Perceived color in a projected image is a factor of both projection surface and projector quality. Since white is a relatively neutral color, white surfaces are best suited to viewers wanting natural color tones; as such, white projection surfaces are more common in business and school presentation environments. However, the darkest black in a projected image can only be as dark as the surface it is projected on. Because of this, some presenters and presentation space planners prefer grey screens, which create the perception of higher contrast by placing the image on a darker background. The trade-off of this perceived higher contrast is that color tones are shifted (lips may look purplish, for example); this can be adjusted using the projector's color and hue settings but can never be completely corrected.
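
This black-level limit can be made concrete with a small back-of-the-envelope calculation (illustrative numbers only, not measurements of any particular projector or screen):

    # Perceived contrast = brightest white / darkest black reaching the viewer.
    # The "black" a viewer sees is projector light leakage plus ambient room
    # light, both reflected by the screen. All numbers below are made up.
    projector_white = 2000.0   # light hitting the screen for full white
    projector_black = 2.0      # residual leakage when showing full black
    ambient_light = 10.0       # room light falling on the screen

    for name, reflectance in [("white screen", 0.95), ("grey screen", 0.50)]:
        white = (projector_white + ambient_light) * reflectance
        black = (projector_black + ambient_light) * reflectance
        print(f"{name}: black {black:.1f}, measured ratio {white / black:.0f}:1")

    # Both screens give the same measured ratio, but the grey screen's
    # absolute black is darker, which viewers perceive as higher contrast.
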
Early LCD systems were often intended to be used with existing overhead projectors, built as a large "plate" that was placed on the projector in place of the transparencies. This provided the market with a stop-gap solution in the era when the computer was not yet a universal display medium, creating a market for LCD projectors before their current main use became popular.
LCD projection is also used in large television sets to allow better image quality than a single large panel can provide. A common rule of thumb is that an LCD panel's image quality decreases as its size increases; a workaround is to use a small LCD panel (or panels) and project the image through a lens onto a rear-projection screen, giving a larger screen size (with a decreased contrast ratio) without the quality loss. A current exception is LG's 100-inch LCD TV, still in the prototype stage, which is a major step toward projector-sized televisions.

In 2004 and 2005, LCD front projection enjoyed a comeback because of the addition of the dynamic iris, which improved perceived contrast up to the levels of DLP.

The basic design of an LCD projector is frequently used by hobbyists who build their own DIY projection systems. The basic technique is to combine a high-CRI HID lamp and ballast with a condenser and collector Fresnel lens, an LCD panel removed from a common computer display, and a triplet lens.

Blu-ray Disc

A Blu-ray Disc (also called BD) is a high-density optical disc format for the storage of digital media, including high-definition video. The name Blu-ray Disc is derived from the blue-violet laser used to read and write this type of disc. Because of its shorter wavelength (405 nm), substantially more data can be stored on a Blu-ray Disc than on the DVD format, which uses a red, 650 nm laser. A Blu-ray Disc can store 25 GB on each layer, as opposed to a DVD's 4.7 GB.

Blu-ray Disc is similar to PDD, another optical disc format developed by Sony (available since 2004), but offers higher data transfer speeds. PDD was not intended for home video use and was aimed at business data archiving and backup. Blu-ray Disc is currently in a "format war" with the rival format HD DVD.

The Blu-ray Disc system uses a blue-violet laser operating at a wavelength of 405 nm, similar to the one used for HD DVD, to read and write data. Conventional DVDs and CDs use red and infrared lasers at 650 nm and 780 nm respectively.
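
The capacity gain over DVD follows roughly from the smaller laser spot: spot diameter scales with wavelength divided by the numerical aperture (NA) of the lens, and areal data density scales with the inverse square of the spot size. A quick sanity check (using the published wavelengths and the standard NA values of 0.60 for DVD and 0.85 for Blu-ray):

    # Areal density scales roughly as (NA / wavelength)^2.
    dvd_wavelength, dvd_na = 650e-9, 0.60   # red-laser DVD
    bd_wavelength, bd_na = 405e-9, 0.85     # blue-violet-laser Blu-ray

    gain = (bd_na / bd_wavelength) ** 2 / (dvd_na / dvd_wavelength) ** 2
    print(f"density gain: ~{gain:.1f}x")              # ~5.2x
    print(f"scaled DVD layer: ~{4.7 * gain:.1f} GB")  # ~24 GB, close to 25
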
Because the Blu-ray Disc standard places the data recording layer close to the surface of the disc, early discs were susceptible to contamination and scratches and had to be enclosed in plastic caddies for protection. The consortium worried that such an inconvenience would hurt Blu-ray Disc's market adoption. Blu-ray Discs now use a layer of protective material on the surface through which the data is read.

The recent introduction of a clear polymer coating has given Blu-ray Discs substantial scratch resistance. The coating is developed by TDK and is called "Durabis". It allows BDs to be cleaned safely with only a tissue. The coating is said to successfully resist "wire wool scrubbing" according to Samsung Optical technical manager Chas Kalsi. It is not clear, however, whether discs will use the Durabis coating as standard or only in premium discs.

Both Sony and Panasonic replication methods include proprietary hard-coat technologies. Sony's rewritable media are sprayed with a scratch-resistant and antistatic coating. Verbatim's recordable and rewritable Blu-ray discs use a proprietary hard-coat technology called ScratchGuard.
Although the Blu-ray Disc specification has been finalized, engineers continue working to advance the technology. Quad-layer (100 GB) discs have been demonstrated on a drive with modified optics. Furthermore, TDK announced in August 2006 that it had created a working experimental Blu-ray Disc capable of holding 200 GB of data on a single side, using six 33 GB data layers. Such discs would almost certainly not work on some of today's Blu-ray Disc players, as these devices are only designed and tested on discs that meet the current specification.

High Definition Television

High-definition television (HDTV) is a digital television broadcasting system with a significantly higher resolution than traditional formats (NTSC, SECAM, PAL). While some early analog HDTV formats were broadcast in Europe and Japan, HDTV is usually broadcast digitally, because digital television (DTV) broadcasting requires much less bandwidth. HDTV technology was first introduced in the US during the 1990s by a group of electronics companies called the Digital HDTV Grand Alliance.

In the early 2000s, a number of high-definition television standards were competing for the still-developing niche markets. Current HDTV standards are defined by the International Telecommunication Union (ITU-R BT.709) as 1080 active interlaced or progressive scan lines, or 720 progressive scan lines, using a 16:9 aspect ratio. HDTV is also capable of "theater-quality" audio because it uses the Dolby Digital (AC-3) format to support "5.1" surround sound. While HDTV is closer to theater quality than conventional television, the 35 mm and 70 mm film projectors used in theaters still offer the highest resolution and best viewing quality on very large screens. Many HDTV programs are produced from movies on film as well as from content shot in HD video.
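
The resolution difference is easy to quantify. A short calculation (assuming the common frame sizes of 1920x1080 for 1080-line HDTV, 1280x720 for 720-line HDTV, and 720x480 for NTSC-derived standard definition):

    # Pixels per frame for common formats (16:9 HD vs. NTSC-derived SD).
    formats = {"1080 HD": (1920, 1080), "720 HD": (1280, 720), "480 SD": (720, 480)}
    sd_pixels = 720 * 480

    for name, (w, h) in formats.items():
        pixels = w * h
        print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.1f}x SD)")

    # 1080-line HD carries about six times as many pixels per frame as SD.
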

The term "high-definition" can refer to the resolution specifications themselves, or more loosely to media capable of similar sharpness, such as photographic film. As of 2007, 24 million US households have HDTVs. However, only half are set up to actually receive HDTV programming as some consumers are not aware that they must get special receivers to get HDTV from cable, or use HDTV tuners to receive over-the-air broadcasts, and some are planning to use it in the future.

Optical computer

An optical computer is a computer that uses photons, rather than electrons, to manipulate, store and transmit data. Photons have fundamentally different physical properties from electrons, and researchers have attempted to exploit these properties to produce computers with performance and/or capabilities greater than those of electronic computers. Optical computer technology is still in its early stages: functional optical computers have been built in the laboratory, but none has progressed past the prototype stage.

Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. Other research projects take a non-traditional approach, attempting to develop entirely new methods of computing that are not physically possible with electronics.

The fundamental building block of modern electronic computers is the transistor. To replace electronic components with optical ones, an equivalent "optical transistor" is required. This is achieved using materials with a non-linear refractive index. In particular, materials exist in which the intensity of incoming light affects the intensity of the light transmitted through the material, in a manner similar to the voltage response of an electronic transistor. This "optical transistor" effect is used to create logic gates, which in turn are assembled into the higher-level components of the computer's CPU.
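
A toy numerical model of such intensity-dependent transmission (a hypothetical sigmoid response, not any real material's measured curve) shows how an AND-like gate could arise: only when both input beams arrive does the combined intensity cross the nonlinear threshold.

    import math

    def transmitted(intensity, threshold=1.5, steepness=10.0):
        """Toy nonlinear transmission: low below threshold, high above it."""
        return 1.0 / (1.0 + math.exp(-steepness * (intensity - threshold)))

    BEAM = 1.0  # intensity contributed by one input beam being "on"

    for a in (0, 1):
        for b in (0, 1):
            out = transmitted(BEAM * a + BEAM * b)
            print(f"A={a} B={b} -> output {out:.2f} (logic {int(out > 0.5)})")

    # Only A=1, B=1 drives the material past threshold: an AND gate.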

Another claimed advantage of optics is reduced power consumption, but an optical communication system will typically use more power over short distances than an electronic one. This is because the shot noise of an optical communication channel is greater than the thermal noise of an electrical channel, which, from information theory, means that more signal power is required to achieve the same data capacity. However, over longer distances and at greater data rates, the loss in electrical lines is sufficiently large that optical communications comparatively use less power. As communication data rates rise, this crossover distance becomes shorter, and so the prospect of using optics in computing systems becomes more practical.
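
The noise comparison can be made concrete: at optical frequencies the quantum (shot-noise) energy scale hv is far larger than the thermal energy scale kT that limits an electrical channel at room temperature, so more signal power is needed per bit. A rough order-of-magnitude sketch (assuming a common 1550 nm telecom wavelength):

    # Compare the noise energy scales of optical vs. electrical channels.
    h = 6.626e-34          # Planck constant, J*s
    k = 1.381e-23          # Boltzmann constant, J/K
    c = 3.0e8              # speed of light, m/s

    T = 300.0              # room temperature, K
    wavelength = 1550e-9   # common telecom wavelength, m

    thermal = k * T                # ~4.1e-21 J, electrical noise scale
    photon = h * c / wavelength    # ~1.3e-19 J per photon, optical noise scale

    print(f"hv / kT = {photon / thermal:.0f}")  # roughly 30x

    # Over short links this extra per-photon cost dominates; over long links
    # the much lower loss of optical lines more than pays it back.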

A significant challenge to optical computing is that computation is a nonlinear process in which multiple signals must interact to compute the answer. Light, an electromagnetic wave, can interact with another electromagnetic wave only in the presence of electrons in a material, and the strength of this interaction is much weaker for light than for the electronic signals in a conventional computer. As a result, the processing elements of an optical computer require higher powers and larger dimensions than those of a conventional electronic computer using transistors.

Evolution

Biological evolution is the change in a population's inherited traits from generation to generation. These traits are encoded as genes that are copied and passed on to offspring during reproduction. Mutations and other random changes in these genes can produce new or altered traits, resulting in heritable differences (genetic variation) between organisms. Evolution occurs when these differences become more common or rarer in a population, either randomly through genetic drift, or according to the reproductive value of traits through natural selection.

Natural selection occurs because organisms with traits that help them survive and reproduce tend to have more offspring. In doing so, they will pass more copies of their inheritable traits on to the next generation. This tends to cause advantageous traits to become more common in each generation, while disadvantageous ones become rarer. Over time, this process can result in varied adaptations to environmental conditions. As differences in and between populations accumulate, species may split into new species. The similarities between organisms suggest that all known species are descended from a single ancestral species through this process of gradual divergence.
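
This interplay of selection and drift can be illustrated with a minimal Wright-Fisher-style simulation (a standard textbook model; the population size and selection coefficient below are arbitrary):

    import random

    def simulate(pop_size=500, advantage=0.05, start_freq=0.1, generations=200):
        """Track the frequency of an advantageous allele across generations."""
        freq = start_freq
        for _ in range(generations):
            # Selection: carriers contribute proportionally more offspring...
            weighted = freq * (1 + advantage)
            expected = weighted / (weighted + (1 - freq))
            # ...and drift: the next generation is a finite random sample.
            carriers = sum(random.random() < expected for _ in range(pop_size))
            freq = carriers / pop_size
            if freq in (0.0, 1.0):  # allele lost or fixed
                break
        return freq

    print(simulate())  # usually drifts toward 1.0, but can still be lost early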

The theory of evolution by natural selection was first put forth in detail in Charles Darwin's 1859 book On the Origin of Species. In the 1930s, Darwinian natural selection was combined with Mendelian inheritance to form the modern evolutionary synthesis. With its enormous explanatory and predictive power, this theory has become the central organizing principle of modern biology, providing a unifying explanation for the diversity of life on Earth.

Dark Matter

In astrophysics and cosmology, dark matter is matter of unknown composition that does not emit or reflect enough electromagnetic radiation to be observed directly, but whose presence can be inferred from gravitational effects on visible matter. According to present observations of structures larger than galaxies, as well as Big Bang cosmology, dark matter accounts for the vast majority of mass in the observable universe. Observed phenomena consistent with dark matter include the rotational speeds of galaxies, the orbital velocities of galaxies in clusters, gravitational lensing of background objects by galaxy clusters such as the Bullet Cluster, and the temperature distribution of hot gas in galaxies and clusters of galaxies. Dark matter also plays a central role in structure formation and galaxy evolution, and has measurable effects on the anisotropy of the cosmic microwave background. All these lines of evidence suggest that galaxies, clusters of galaxies, and the universe as a whole contain far more matter than that which interacts with electromagnetic radiation: the remainder is called the "dark matter component".
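
The rotation-curve evidence amounts to a simple discrepancy: if visible matter were all there is, orbital speeds beyond the luminous disk should fall off in a Keplerian way, while observed speeds stay roughly flat. A minimal sketch (illustrative values, not a fit to real data):

    import math

    G = 6.674e-11          # gravitational constant, SI units
    M_visible = 2e41       # rough visible mass of a galaxy, kg (~1e11 suns)

    # Keplerian prediction from visible mass alone: v = sqrt(G*M/r).
    for r_kpc in (10, 20, 40, 80):
        r = r_kpc * 3.086e19                      # kiloparsecs to metres
        v = math.sqrt(G * M_visible / r) / 1000   # km/s
        print(f"r = {r_kpc:>2} kpc: predicted {v:.0f} km/s")

    # The prediction falls as 1/sqrt(r); observed curves stay near-flat
    # (~200 km/s), implying enclosed mass that keeps growing with radius.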

The composition of dark matter is unknown, but may include new elementary particles such as WIMPs, axions, and ordinary and heavy neutrinos, as well as astronomical bodies such as dwarf stars and planets (collectively called MACHOs), and clouds of nonluminous gas. Current evidence favors models in which the primary component of dark matter is new elementary particles, collectively called non-baryonic dark matter.

The dark matter component has vastly more mass than the "visible" component of the universe. At present, the density of ordinary baryons and radiation in the universe is estimated to be equivalent to about one hydrogen atom per cubic metre of space. Only about 4% of the total energy density in the universe (as inferred from gravitational effects) can be seen directly. About 22% is thought to be composed of dark matter. The remaining 74% is thought to consist of dark energy, an even stranger component, distributed diffusely in space. Some hard-to-detect baryonic matter makes a contribution to dark matter, but constitutes only a small portion. Determining the nature of this missing mass is one of the most important problems in modern cosmology and particle physics. It has been noted that the names "dark matter" and "dark energy" serve mainly as expressions of our ignorance, much as the marking of early maps with terra incognita.

Ball Lightning

In January 1984, ball lightning measuring about four inches in diameter entered a Russian passenger aircraft and, according to the Russian news release, "flew above the heads of the stunned passengers. In the tail section of the airliner, it divided into two glowing crescents which then joined together again and left the plane almost noiselessly." The ball lightning left two holes in the plane.

Ball lightning is another natural phenomenon for which science has yet to come up with a full explanation. The problem for scientists is that the phenomenon manifests so rarely that it is almost impossible to study. Attempts have been made to recreate it artificially in the laboratory, but an actual specimen of naturally occurring ball lightning has yet to be captured for study. This may be impossible, since the phenomenon is fleeting: floating about for a while and then fading away or exploding with a loud pop.

What makes ball lightning so fascinating and puzzling is its strange "behavior." Witnesses have said that it moves about as if with a kind of intelligence, following patterns on walls or furniture and seeming to avoid obstacles. More mysterious still is its ability to pass through solid objects. Sometimes it leaves holes, as with the airliner above, but it has also been seen to pass through window glass and even walls without leaving a mark.

Computer Science

Computer science, or computing science, is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. Computer science has many sub-fields; some emphasize the computation of specific results (such as computer graphics), while others relate to properties of computational problems (such as computational complexity theory). Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems. A further subfield, human-computer interaction, focuses on the challenges in making computers and computations useful, usable, and universally accessible to people.

The history of computer science predates the invention of the modern digital computer by many years. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard built the first mechanical calculator in 1623. Charles Babbage designed a difference engine in Victorian times, and around 1900 punch-card machines were being sold commercially (by Herman Hollerith's company, a forerunner of IBM). However, all of these machines were constrained to perform a single task, or at best some subset of all possible tasks.

During the 1940s, as newer and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs. Since practical computers became available, many applications of computing have become distinct areas of study in their own right.

General relativity

General relativity (GR) [also called the general theory of relativity (GTR) and general relativity theory (GRT)] is the geometrical theory of gravitation published by Albert Einstein in 1915/16. It unifies special relativity and Sir Isaac Newton's law of universal gravitation with the insight that gravitational force can be regarded as the manifestation of the curvature of space and time, with this curvature being produced by the mass-energy and momentum content of the matter in space-time. General relativity is distinguished from other metric theories of gravitation by its use of the Einstein field equations to relate space-time content and space-time curvature.

General relativity is currently the most successful gravitational theory, being almost universally accepted and well supported by observations. The first success of general relativity was in explaining the anomalous perihelion precession of Mercury. Then in 1919, Sir Arthur Eddington announced that observations of stars near the eclipsed Sun confirmed general relativity's prediction that massive objects bend light. Since then, many other observations and experiments have confirmed many of the predictions of general relativity, including gravitational time dilation, the gravitational redshift of light, signal delay, and gravitational radiation. In addition, numerous observations are interpreted as confirming one of general relativity's most mysterious and exotic predictions, the existence of black holes.

In the mathematics of general relativity, the Einstein field equations form a set of simultaneous differential equations which are solved to produce metric tensors of space-time. These metric tensors describe the shape of the space-time and are used to obtain the predictions of general relativity. The connection derived from the metric specifies the geodesic paths that objects follow when traveling inertially. Important solutions of the Einstein field equations include the Schwarzschild solution (for the space-time surrounding a spherically symmetric, uncharged, non-rotating massive object), the Reissner-Nordström solution (for a charged, spherically symmetric massive object), and the Kerr metric (for a rotating massive object).
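
For reference, a standard statement of the field equations and of the Schwarzschild solution they yield (conventional notation; overall signs depend on the sign convention chosen):

    G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu}
               = \frac{8\pi G}{c^4}\, T_{\mu\nu}

    ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2\, dt^2
           + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2
           + r^2\, d\Omega^2

Here G_{\mu\nu} encodes the space-time curvature and T_{\mu\nu} the mass-energy and momentum content, making precise the relation between content and curvature described above.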

In spite of its overwhelming success, there is discomfort with general relativity in the scientific community due to its incompatibility with quantum mechanics and to the reachable singularities of black holes (at which the mathematics of general relativity breaks down). Because of this, numerous other theories have been proposed as alternatives to general relativity. An early and still-popular class of modifications is Brans-Dicke theory, which, although not solving the problems of singularities and quantum gravity, appeared to have observational support in the 1960s. However, those observations have since been refuted, and modern measurements indicate that any Brans-Dicke type of deviation from general relativity must be very small if it exists at all.

Special relativity

The special theory of relativity was proposed in 1905 by Albert Einstein in his article "On the Electrodynamics of Moving Bodies". Earlier, Galileo's principle of relativity had stated that all uniform motion was relative, and that there was no absolute and well-defined state of rest; a person on the deck of a ship may be at rest in his opinion, but someone observing from the shore would say that he was moving. Einstein's theory combines Galilean relativity with the postulate that all observers will always measure the speed of light to be the same no matter what their state of uniform linear motion is.

This theory has a variety of surprising consequences that seem to violate common sense, but that have been verified experimentally. Special relativity overthrows Newtonian notions of absolute space and time by stating that distance and time depend on the observer, and that time and space are perceived differently, depending on the observer. It yields the equivalence of matter and energy, as expressed in the mass-energy equivalence formula E = mc², where c is the speed of light in a vacuum. Special relativity agrees with Newtonian mechanics in their common realm of applicability, in experiments in which all velocities are small compared to the speed of light.
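
One such verified consequence, time dilation, follows from the Lorentz factor γ = 1/√(1 - v²/c²). A quick numerical check (speeds chosen purely for illustration):

    import math

    c = 299_792_458.0  # speed of light in vacuum, m/s

    def gamma(v):
        """Lorentz factor for speed v."""
        return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

    for v in (0.01 * c, 0.5 * c, 0.99 * c):
        print(f"v = {v/c:.2f}c: moving clock runs slow by factor {gamma(v):.3f}")
    # At everyday speeds gamma ~ 1 (Newtonian agreement); near c it diverges.

    # Mass-energy equivalence: one gram of matter as energy.
    print(f"E = mc^2 for 1 g: {0.001 * c**2:.2e} joules")  # ~9e13 J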

The theory was called "special" because it applies the principle of relativity only to inertial frames. Einstein developed general relativity to apply the principle generally, that is, to any frame, and that theory includes the effects of gravity. Special relativity does not account for gravity, but it can deal with accelerations.

Although special relativity makes some quantities relative, such as time, that we would have imagined to be absolute based on everyday experience, it also makes absolute some others that we would have thought were relative. In particular, it states that the speed of light is the same for all observers, even if they are in motion relative to one another. Special relativity reveals that c is not just the velocity of a certain phenomenon - light - but rather a fundamental feature of the way space and time are tied together. In particular, special relativity states that it is impossible for any material object to accelerate to light speed.
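
The impossibility of reaching c is built into how velocities combine: under relativistic velocity addition, u' = (u + v)/(1 + uv/c²), collinear speeds never compose to a result above c. A short check (speeds chosen for illustration):

    c = 299_792_458.0  # speed of light in vacuum, m/s

    def add_velocities(u, v):
        """Relativistic composition of collinear velocities."""
        return (u + v) / (1 + u * v / c**2)

    # Even 0.9c "plus" 0.9c stays below c:
    print(add_velocities(0.9 * c, 0.9 * c) / c)  # ~0.994, not 1.8
    # And anything combined with c gives back exactly c:
    print(add_velocities(0.5 * c, c) / c)        # 1.0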