Incredible Hulk, The (film)
In the 2008 film based on the Marvel comic series, scientist Bruce Banner is living in the shadows while on the run from the U.S. government. He must find an antidote for the monster he becomes whenever he loses his temper, but the warmongers who dream of abusing his powers won’t leave him alone. He must ultimately fight a beast of pure adrenaline and aggression whose might matches The Hulk’s own. Banner must call upon the hero within to rescue New York City from total destruction, if he is ever to be with the woman he loves again. The film starred Edward Norton as Dr. Banner.
Incredible Hulk, The (TV series)
In the 1978-82 series, Dr. David Banner is a brilliant research scientist studying the effects of gamma radiation on human strength. Using himself as a guinea pig, he infuses his body with direct gamma rays, but the experiment goes wrong. From that day on, whenever he is under extreme stress or becomes very angry, his body undergoes a freakish physiological change, turning him into a muscular and powerful green creature. Each time, after this Hulk of a creature deals with whatever threatens Dr. Banner, he morphs back to normal human form with no memory of his Hulk persona’s actions, and only tattered clothing as evidence. Dr. Banner is presumed dead after his home is destroyed and the Hulk is seen running from the grounds, so the scientist is left to roam the country in search of a cure, while trying to keep his darker side tame. The series, which was based on the popular Marvel Comics antihero, starred Bill Bixby as Banner and bodybuilder Lou Ferrigno as The Hulk. (Network executives changed the comic book character’s name from Bruce Banner to David Bruce Banner because they felt “David” was a more masculine name.)
Superheroes once walked and flew, patrolling the streets of Metroville and performing daily acts of heroism. Then a string of lawsuits by disgruntled people they had helped led to a political nightmare and a public outcry, after which the Supers were forced into retirement and government-funded anonymity. Fifteen years later, Bob Parr (once known as Mr. Incredible) lives a mundane life as a suburban insurance agent. Although his wife Helen (formerly Elastigirl) has moved on and is more concerned with raising their children than battling evil, Bob still yearns for the good old days. His chance for new glory comes when he is approached by a shadowy organization and asked to help them … as a superhero … with a problem. But all is not as it seems, and Bob finds himself trapped by an unexpected villain, with only his family able to save him! Written and directed by Brad Bird, the 2004 Pixar hit’s voice cast included Holly Hunter and Craig T. Nelson.
An evil spirit, believed to sit or lie on sleeping humans. Also believed to have sexual intercourse with sleeping women, the incubus is the male counterpart to the succubus.
Industrial Light & Magic
Founded in an industrial warehouse space in Van Nuys, California by George Lucas in 1975 as the special effects lab for Star Wars: Episode IV – A New Hope, Industrial Light & Magic accumulated over 300 films on its resume by 2017, including Jumanji, Who Framed Roger Rabbit, and The Mask, as well as films in the Lord of the Rings, rebooted Star Trek, Harry Potter, The Terminator, Back to the Future, and Mission: Impossible families of blockbusters. According to Steven Spielberg: “I always thought that if ILM had run the space agency, we’d have colonized Mars by now.”
- The property of matter by which an object retains its state of rest or of uniform motion, unless acted upon by some external force.
- In general use, lack of movement or activity, especially when movement or activity is wanted or needed; the condition of not having the energy or desire that is needed to move, change, etc. In physics, a property of matter by which something that is not moving remains still and something that is moving goes at the same speed and in the same direction until another thing or force affects it.
- In medicine, the lack of activity, especially of the uterus during childbirth, when its contractions have decreased or stopped.
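The physics sense above is the content of Newton’s first and second laws; a compact statement in standard notation (the symbols are the conventional ones, not drawn from this entry):

```latex
% First law: with no net external force, velocity stays constant.
\sum_i \vec{F}_i = \vec{0} \quad\Longrightarrow\quad \vec{v} = \text{constant}

% Second law: the mass m quantifies inertia; for a given net force,
% a larger mass produces a smaller acceleration.
\vec{a} = \frac{\vec{F}_{\text{net}}}{m}
```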
An idea conceived in the mid-1990s by Scott McCloud, author of Understanding Comics and Reinventing Comics, for how comics in digital environments might take different shapes and pathways from their print counterparts. These “webcomics” design strategies are based on treating the computer screen as a window rather than a page, allowing the reader to proceed in any of several directions, rather than simply in left-to-right linear style. In the early 2000s, most who were using the term had only a sketchy idea of what it meant, but as internet bandwidth increased and hardware matured, such expanded-canvas comics became increasingly practical. Despite its shaky beginnings, the expanded-canvas approach isn’t just for computer geeks. In fact, most of the advantages of telling stories on a single picture plane are surprisingly practical, and contribute to the reading experience without the need for radical designs. Print cartoonists, who state that they must make several compromises in story and art in order to fit the pages of a traditional comic, claim that the infinite canvas concept will improve comics in several areas, such as pacing, dynamic range and story flow. In addition, there would be much more freedom in the distance between panels, and artists would be better able to control the pacing of the story, as wider gaps between panels tend to signal the slowing of time in a reader’s mind. Also, the new form would open up the z-axis, allowing a third dimension for other storytelling purposes, including layered narratives, flashbacks, and tonal variation.
In this 2005-2006 DC Comics seven-issue crossover event written by Geoff Johns with illustrations by Phil Jimenez, George Perez, Ivan Reis and Jerry Ordway, heroes and villains across many universes come to harm. The story picks up from the end of Crisis on Infinite Earths, when Alexander Luthor, Lois Lane of Earth-Two, Superboy-Prime and Superman of Earth-Two, as the last surviving heroes of the destroyed Multiverse, disappeared. It is revealed that they have been monitoring the heroes of New Earth as they descend into darkness, and believe this generation unworthy of their legacy. Superman, Batman and Wonder Woman debate the current situation on Earth, and Batman makes it clear that Superman needs to be on Earth to help, and that Wonder Woman crossed the line by killing Maxwell Lord of Checkmate for mind-controlling Superman. The survivors return to bring back the Multiverse, in an effort to restore the idealistic values of their own time. Superman of Earth-Two reveals that they are planning to make Earth-Two the template for the current reality instead of Earth-One. Superman and Lois are happy to be back on Earth-Two, standing in front of the Daily Star. However, Superman soon suffers a great loss of his own.
With the tower activated, Alexander fires a beam that brings back the rest of the Multiverse, returning the people of Earth to their native Earths, including Earth-S, Earth-898, and even Earth-Three. Alexander tries to merge different Earths together in order to create the perfect Earth. Superboy tackles Superboy-Prime and the two crash through the tower, destroying not only it, but Alex’s Multiverse, and remerging the Earths into one New Earth. The two Supermen, Wonder Woman and Batman arrive too late, and Superboy dies in Wonder Girl’s arms.
The Multiverse is merged again, and a New Earth is born out of it. As the heroes return to Earth to mourn the dead, Superman, Batman, and Wonder Woman find themselves needing time to recover, physically and spiritually. As Diana flies off in her invisible jet to find out who she really is, and Batman goes on a journey with Dick Grayson and Tim Drake to rebuild himself, Clark Kent joins with his wife Lois, realizing that even though he will need time to recharge his powers, the Earth will be in pretty good hands with enough heroes present to take care of things.
First theorized in the early 1970s by Stephen Hawking, who, building on earlier work by Jacob Bekenstein at the Hebrew University of Jerusalem, suggested that black holes are not totally black. Hawking showed that particle/antiparticle pairs generated at the event horizon (or outer periphery) of a black hole would be separated. One particle would fall into the black hole while the other would escape, making the black hole a radiating body. Hawking’s theory implied that over time, a black hole would eventually evaporate away, leaving nothing. This presented a problem for quantum mechanics, which dictates that nothing, including information, can ever be lost. If black holes withheld information forever in their singularities, there would be a fundamental flaw in quantum mechanics. The significance of the information paradox came to a head in 1997 when Hawking, together with Kip Thorne of the California Institute of Technology (Caltech), placed a bet with John Preskill, also of Caltech. At the time, Hawking and Thorne both believed that information was lost in black holes, while Preskill thought that this was impossible. Later, however, Hawking conceded the bet, saying he believed that information is returned, albeit in a disguised state.
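For scale, the standard textbook results for Hawking’s radiating black hole (not stated in this entry) can be written compactly:

```latex
% Hawking temperature of a black hole of mass M: smaller holes are
% hotter, so the radiation accelerates as the hole shrinks.
T_H = \frac{\hbar c^3}{8 \pi G M k_B}

% The evaporation time grows as the cube of the mass, which is why
% stellar-mass black holes take far longer than the age of the
% universe to disappear.
t_{\text{evap}} \sim \frac{G^2 M^3}{\hbar c^4}
```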
Maulik Parikh of the University of Utrecht in the Netherlands, together with Frank Wilczek of the Institute for Advanced Study in Princeton, showed how information could leak away from a black hole. In their theory, information-carrying particles just within the event horizon could tunnel through the barrier, following the principles of quantum mechanics. This solution, however, was also debatable. Samuel Braunstein and Manas Patra of the University of York in the UK formulated a tunneling theory that looked more attractive to peers than Parikh and Wilczek’s theory. Normally, theorists dealing with black holes have to wrestle with the complex geometries of space–time arising from Einstein’s theory of gravitation, the theory of general relativity. In their model, Braunstein and Patra say that the event horizon is purely quantum mechanical in nature, with bits of quantum space tunneling through the barrier. Put simply, Braunstein and Patra said that tunneling seemed far more likely to be an intrinsic feature of black holes, so information is probably not lost after all.
String theorist Erik Verlinde of the University of Amsterdam, building on work by Ted Jacobson of the University of Maryland, put forward a speculative idea for the origin of gravity. Under Verlinde’s proposal, gravity is not a fundamental interaction, but emerges from the universe trying to maximize disorder. Gravity is therefore an “entropic force,” a natural consequence of thermodynamics, much as one feels a force on a stretched rubber band as its molecules attempt to wriggle back into disordered states. Braunstein and Patra believe that their black-hole model argues in favor of Verlinde’s proposal: if gravity (not to mention inertia or space–time) is emergent, then it should not be needed to unravel the basic information-loss mechanism of black holes, which is exactly what the York researchers have shown.
Steve Giddings, a physicist specializing in quantum gravity at the University of California, Santa Barbara, does not think that Braunstein and Patra have addressed “the most central questions” of Verlinde’s proposal; however, he says they have put forward another important link between quantum gravity and information.
Created in January 1994 by Steven Kirsch, InfoSeek (then known as InfoSeek Guide) was a pay-for-use service, with the fees being dropped in August 1994. It was rechristened InfoSeek Search in February 1995, and became a true web search engine. Though InfoSeek was neither first nor overly original as a search engine, it did have a user-friendly interface and did add some unique links that demonstrated the marketing touch of Kirsch, such as UPS parcel tracking and a news feed. But where InfoSeek really hit the big leagues was with a strategic deal in 1995 that saw InfoSeek land as the default search engine for the Netscape web browser. That deal bumped Yahoo! from the default position and put InfoSeek on the screens of virtually all internet surfers.
Since June 1998, Disney had held about 42% of Infoseek’s shares, and the two firms jointly operated the portal. In July 1999, Walt Disney Co. agreed to acquire the remaining 58% of Infoseek Corp. that it did not already own, combining its Buena Vista Internet Group with Infoseek to create a single internet unit: go.com. Disney CEO Michael Eisner said the deal put the company in a better position to use the web to provide multimedia content online, such as Disney movies.
Short for English Socialism, the totalitarian political system/ideology of Oceania in George Orwell’s Nineteen Eighty-Four. The term itself is Newspeak, the novel’s engineered language, standing for “English Socialism” or the English Socialist Party.
A type of non-impact printer that propels droplets of ink directly onto the medium (or paper). Today, almost all inkjet printers produce color. Low-end inkjets use three ink colors: cyan (blue), magenta (red) and yellow, and can also produce a composite black by layering all three. Four-color inkjets add black ink for pure black printing. The first inkjet mechanism that was developed sprayed a continuous stream of droplets aimed onto the paper. Although the continuous method is still used, most inkjets now use the drop-on-demand method, which forces a drop of ink out of a chamber by heat or electricity. The thermal method heats a resistor that creates an air bubble in the ink chamber, forcing a droplet of ink out of the nozzles. Other inkjets use a piezoelectric technique that charges crystals, which expand and “jet” the ink. Continuous inkjet technology, by contrast, sprays a steady stream of droplets that either reach the paper or wind up in the return gutter.
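The arithmetic behind three-color composite black versus four-color true black can be sketched in a few lines. This is a minimal illustration of the subtractive color model, not the firmware of any actual printer; the function name and the [0, 1] channel normalization are assumptions:

```python
def rgb_to_cmyk(r, g, b):
    """Convert RGB values in [0, 1] to CMYK values in [0, 1].

    Subtractive model: cyan absorbs red, magenta absorbs green,
    yellow absorbs blue.  The black (K) channel is the component
    common to C, M and Y -- a four-color printer lays down true
    black ink there instead of a muddy three-ink composite.
    """
    c, m, y = 1 - r, 1 - g, 1 - b
    k = min(c, m, y)            # amount replaceable by black ink
    if k == 1:                  # pure black: no color ink needed
        return 0.0, 0.0, 0.0, 1.0
    # remove the black component from each color channel
    return (c - k) / (1 - k), (m - k) / (1 - k), (y - k) / (1 - k), k
```

A three-ink printer effectively stops before the K extraction and prints the raw C, M, Y values, which is why its "black" is a composite.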
As a noun, whatever data goes into a computer, from commands you type in with the keyboard to data from another computer or device. While input generally comes from humans, computers can also receive input from other sources. Such sources include audio and video devices that record movies and sound, media discs that install software, and even the internet, which is used to download files and receive data, such as e-mail or instant messages. A device that feeds data into a computer, such as a keyboard, scanner, digital camera or mouse, is called an input device. As a verb, the act of entering data into a computer.
In gaming, a slang term for instant kill.
The leading photo social platform on the internet, the application (or “app”), which boasts 300 million active monthly users, or 20% of internet users, launched on October 6, 2010. Only two months later, it had over 1 million users, and one year after that, the total was up to 10 million. On December 9, 2011, it was voted the “iPhone App of the Year.” April 3, 2012 saw the launch of Instagram’s Android app, and less than one week later, Instagram was purchased by Facebook. By July of 2012, there were 80 million active users of the photo-sharing site.
In massively multiplayer online role-playing games, or MMORPGs, an instance of a given dungeon generated exclusively for a specific player or party of players; contrast to the overworld, shared by all players on the server.
A text message sent over the internet from a computer or mobile device, which appears on the receiving computer or phone almost at the moment it is sent. Commonly known by its initials, “IM.”
Institute of Electrical and Electronics Engineers (IEEE)
An association dedicated to advancing technological innovation and excellence for the benefit of humanity, IEEE (commonly pronounced “I-triple-E”) is the world’s largest technical professional society. It was founded with the mission to serve professionals involved in all aspects of the electrical, electronic and computing fields and related areas of science and technology that underlie modern civilization.
The roots of the Institute go back to 1884, when electricity began to become a major influence on society. One major established electrical industry, which had existed since the 1840s, had come to connect the world with a data communications system faster than the speed of transportation: the telegraph. Meanwhile, the telephone and electric power and light industries had just gotten underway. In the spring of that year, a small group of individuals in the electrical professions met in New York, referring to themselves as the American Institute of Electrical Engineers, or AIEE. That October, the AIEE held its first full meeting in Philadelphia, Pennsylvania. Many early leaders, such as founding President Norvin Green of Western Union, came from telegraphy. Others, like Thomas Edison, came from power, while Alexander Graham Bell represented the new telephone industry. Electric power spread rapidly, enhanced by innovations such as AC induction motors, long-distance AC transmission, and larger power plants. Companies such as AEG, General Electric, Siemens & Halske, and Westinghouse underwrote its commercialization. The AIEE became increasingly focused on electrical power and its ability to change people’s lives through the unprecedented products and services it could deliver. There was a secondary focus on wired communication, with both the telegraph and the telephone. Through technical meetings, publications and promotion of standards, the AIEE led the growth of the electrical engineering profession, and through local sections and student branches, it brought its benefits to engineers in widespread places.
Beginning with Guglielmo Marconi’s wireless telegraphy experiments in 1895-96, a new industry arose. What was originally called “wireless” telegraphy became radio with the electrical amplification possibilities inherent in the vacuum tubes that evolved from John Fleming’s diode and Lee de Forest’s triode. With the new industry came a new society in 1912, the Institute of Radio Engineers. The IRE was modeled on the AIEE but was devoted to radio, and then broadly to electronics. Through the leadership of the two societies, and with the applications of its members’ innovations to industry, electricity wove its way more deeply into every corner of life, via transistors, radar, television and computers. Increasingly, the interests of the professional societies overlapped. Membership in both societies grew, but beginning in the 1940s, the IRE grew faster, becoming the larger group by 1957. On January 1, 1963, the AIEE and the IRE merged to form the Institute of Electrical and Electronics Engineers, or the IEEE. At its formation, the IEEE had 150,000 members, 140,000 of whom resided in the United States. IEEE’s members include computer scientists, software developers, information technology professionals, physicists, medical doctors and many others, in addition to IEEE’s electrical and electronics engineering core. For this reason, the organization no longer goes by its more limiting full name, except on legal business documents, and is commonly referred to by its initials.
Insult sword fighting
A concept first introduced in the game The Secret of Monkey Island, the idea behind insult sword fighting is not about having the sharpest sword, but the sharpest wit. To start, one pirate opens with a line such as, “People fall at my feet when they see me coming.” The opponent then must retort with something that makes sense, such as, “Even before they smell your breath?” However, a nonsensical response (such as “How appropriate. You fight like a cow.”) will cause the “defending” pirate to lose the round. A pirate who wins three rounds consecutively (either defending or attacking) wins the fight. Guybrush Threepwood, the protagonist of the Monkey Island series, must master insult sword fighting in order to complete the three trials to become a pirate. He is trained by the sword master Captain Smirk, and then must defeat the sword master Carla. Carla is so good at insult sword fighting that the player can’t insult, only retort. This happens again in The Curse of Monkey Island game, while facing Captain Rottingham. In Escape from Monkey Island, the player faces Ozzie Mandrill, but due to his thick Australian accent and gratuitous use of Australian slang, it is impossible to defeat him. Land-based insult sword fighting is pretty straightforward, but on the sea the rules change. When insult sword fighting breaks out on a boat at sea, the retorts must not only make sense, they must also rhyme with the insults.
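The round-and-streak rule described above can be sketched in a few lines of Python; the insult/retort pairs and function names here are illustrative stand-ins, not data extracted from the actual games:

```python
# Each insult has exactly one retort that "makes sense"; anything
# else loses the round for the defender.
RETORTS = {
    "People fall at my feet when they see me coming.":
        "Even before they smell your breath?",
    "You fight like a dairy farmer.":
        "How appropriate. You fight like a cow.",
}

def defender_wins(insult: str, reply: str) -> bool:
    """True if the defender's reply is the sensible retort."""
    return RETORTS.get(insult) == reply

def fight(rounds):
    """Play a sequence of (insult, reply) rounds.

    The first side to win three rounds consecutively takes the
    duel; returns "defender", "attacker", or None if neither side
    strings three wins together.
    """
    streak = {"defender": 0, "attacker": 0}
    for insult, reply in rounds:
        winner = "defender" if defender_wins(insult, reply) else "attacker"
        loser = "attacker" if winner == "defender" else "defender"
        streak[winner] += 1
        streak[loser] = 0       # a loss resets the opponent's streak
        if streak[winner] == 3:
            return winner
    return None
```

The sea-battle variant would simply add a rhyme check to `defender_wins` alongside the sense check.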
A whole counting number, whether positive or negative, or the number zero. Integers do not contain fractions or decimal points.
A mathematical object that can be interpreted as an area or a generalization of area. Together with derivatives, integrals are the fundamental objects of integral calculus.
A branch of mathematics concerned with the theory and applications of integrals and integration, particularly the methods of ascertaining indefinite integrals and applying them to the solution of differential equations and the determination of areas, lengths, and volumes.
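As a worked illustration (a standard textbook example, not taken from this entry), the area under the parabola y = x^2 between 0 and 1 is found by evaluating an antiderivative at the endpoints:

```latex
% The antiderivative of x^2 is x^3/3; evaluate from 0 to 1.
\int_0^1 x^2 \, dx
  = \left[ \frac{x^3}{3} \right]_0^1
  = \frac{1}{3} - 0
  = \frac{1}{3}
```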
Engineering on the Intellivision video game system began at Mattel Toys in Hawthorne, California in 1978, just five years after the introduction of Pong, and only one year after the Atari 2600 game system was introduced. The Intellivision console and four game cartridges were successfully test marketed in 1979, and the following year, Intellivision was introduced nationwide. Mattel claimed at the time that the initial console was the core of a home system that would soon include a computer keyboard component, and 15 more Intellivision titles were released for a total of 19, with console sales reaching 175,000. Mattel then hired programmers to start developing software in-house. In 1981, a $6 million ad campaign touted Intellivision’s graphic superiority over the more popular Atari 2600. The media took note, and started covering the video game “war,” raising the profile of the entire home video game industry. Though the $300 Intellivision system was twice as expensive as the 2600, sales soared, reaching 850,000 total console sales by the end of 1981. The computer keyboard and educational software became low priorities compared to the games, and their release dates were pushed back. Mattel Toys spun off Mattel Electronics as a separate company, and hiring increased.
In 1982, the video game industry was valued at $1.5 billion. Mattel Electronics announced profits of over $100 million, with Intellivision units in over 2 million homes. The most popular Intellivision games sold over a million cartridges each. Total game titles available climbed to over 50. Mattel Electronics released the Intellivoice module and three voice games, while raising its ad budget for the year to over $20 million. The promised computer keyboard was released in limited test markets at $600 apiece, but a general release was repeatedly delayed. Game development staff at Mattel reached a total of 100, but in the same year, the higher-resolution ColecoVision video game system hit the market with popular arcade game titles, and took sales away from both Intellivision and Atari. In 1983, the classic brown-and-woodgrain Intellivision console was replaced by the cheaper ($150) light gray Intellivision II. Meanwhile, the computer keyboard component was officially cancelled in favor of the cheaper, less powerful Entertainment Computer System (ECS), and the System Changer module was introduced, allowing Atari 2600 cartridges to be played on Intellivision II consoles. The marketing campaign then pushed Intellivision as the system that played the most games.
New systems were released by other companies, including the Atari 5200 and the Vectrex, and games for all systems flooded the market, with many being rushed out and greeted with poor reviews. By mid-1983, titles available for Intellivision alone approached 100. The sheer volume of video game hardware and software created huge losses and panic within the industry. Mattel Electronics cut the price of its Intellivision II console to $69, cancelled all new hardware development, and laid off hundreds of employees, including 67% of its programming staff. Still, Mattel Electronics ended 1983 with a loss of over $300 million. By 1988, stores stopped carrying Intellivision consoles and games, and sales were made strictly through mail order. One year later, former Intellivision game developers were creating games for the Nintendo Entertainment System (also known as NES), and in 1990, INTV, the Mattel division that owned Intellivision, filed for bankruptcy protection. The division closed its doors the following year.
But that is not the end of the Intellivision story. In 1995, the Blue Sky Rangers, a group of ex-Mattel Electronics programmers, created a website on the history of the Intellivision system, and the traffic numbers proved that there was a continuing nostalgic interest in Intellivision. Formed by those programmers, Intellivision Productions, Inc. obtained exclusive rights to the Intellivision system and games, posting free PC and Mac versions of several of their games on the internet. In 1998 and 1999, respectively, the Intellivision Lives! collections for PC and Mac were published by Intellivision Productions, and between 1999 and 2003, the collections Intellivision Classics, Intellivision Rocks, Intellivision in Hi-Fi (a CD of music played on or inspired by the Intellivision console) and Intellivision Greatest Hits were all released, and a line of handheld games was marketed under the Intellivision brand name.
International Business Machines (IBM)
In 1911, Charles F. Flint engineered the merger of Tabulating Machine Company, Computing Scale Company of America, and International Time Recording Company. The combined Computing-Tabulating-Recording Company (known as C-T-R) manufactured and sold machinery ranging from commercial scales and industrial time recorders to tabulators, punched cards, and meat and cheese slicers.
When the diversified businesses of C-T-R proved difficult to manage, Flint turned for help to former National Cash Register Company executive Thomas J. Watson, Sr. In 1914, Watson joined the company as general manager. Watson implemented a series of effective business tactics: generous sales incentives, a focus on customer service, and an insistence on well-groomed, dark-suited salesmen. Watson preached a positive outlook, and his favorite slogan, “THINK,” became a mantra for C-T-R’s employees. Within 11 months of joining C-T-R, Watson became its president. The company focused on providing large-scale, custom-built tabulating solutions for businesses, and during Watson’s first four years, revenues more than doubled to $9 million. He also expanded the company’s operations to Europe, South America, Asia and Australia.
In the years following World War I, C-T-R’s engineering and research staff developed new and improved mechanisms to meet the broadening needs of its customers. In 1920, C-T-R launched its Electric Accounting Machine, and the following year, the company acquired the business of the Ticketograph Company of Chicago, and certain patents and other property of the Pierce Accounting Machine Company. The Carroll Rotary Press was developed in 1924 to produce cards at high speed, and punched card capacity was doubled. Due to the firm’s expanding areas of production, C-T-R’s name was formally changed to International Business Machines Corporation on February 14, 1924.
During the Great Depression, IBM (also known colloquially as “Big Blue”) managed to grow while the rest of the U.S. economy floundered. Under Watson’s guidance, IBM was among the first corporations to provide group life insurance, survivor benefits and paid vacations. Watson kept his workers busy producing new machines, even while the demand was slack. Thanks to the resulting large inventory of equipment, IBM was ready when the Social Security Act of 1935 brought the company a landmark government contract to maintain employment records for 26 million people. It was called “the biggest accounting operation of all time,” and it went so well that orders from other U.S. government departments quickly followed.
At the onset of World War II, Watson placed all IBM facilities at the disposal of the U.S. government. IBM’s product line expanded again to include bombsights, rifles and engine parts, for a total of more than three dozen major ordnance items. Watson set a nominal 1% profit on those products, and used the money to establish a fund for widows and orphans of IBM war casualties.
The war years also marked IBM’s first steps toward computing. The Automatic Sequence Controlled Calculator, also called the Mark I, was completed in 1944 after six years of development with Harvard University. It was the first machine that could execute long computations automatically. Over 50 feet long, eight feet high and weighing almost five tons, the Mark I took less than a second to solve an addition problem, about six seconds for multiplication, and 10-12 seconds for division. Through the 1940s, IBM introduced the Selective Sequence Electronic Calculator, the 604 Electronic Calculating Punch, and the Card-Programmed Electronic Calculator, the first IBM product designed specifically for computation centers.
IBM made a number of key technological changes in the decade of the 1950s. In 1952, the company introduced the IBM 701, its first large computer based on the vacuum tube. The tubes were quicker, smaller and more easily replaced than the electromechanical switches in the Mark I. The 701 executed 17,000 instructions per second and was used primarily for government and research work. But vacuum tubes rapidly moved computers into business applications such as billing, payroll and inventory control. By 1959, transistors were replacing vacuum tubes.
In 1957, the U.S. Air Force used the IBM 7090, one of the first fully transistorized mainframes capable of performing 229,000 calculations per second, to run its Ballistic Missile Early Warning System. IBM also introduced FORTRAN (FORmula TRANslation), a computer language based on algebra, grammar and syntax rules. It became one of the most widely used computer languages for technical work.
After nearly four decades as IBM’s chief executive, Thomas J. Watson Sr. passed the title of president on to his son, Thomas J. Watson Jr., in 1952. The younger Watson became chief executive officer just six weeks before his father’s death.
Under Thomas J. Watson, Jr., there were also innovations in marketing. IBM introduced the System/360, the first large “family” of computers to use interchangeable software and peripheral equipment, on April 7, 1964. Fortune magazine dubbed the company’s bold departure from the monolithic, one-size-fits-all mainframe “IBM’s $5 billion gamble.” In 1969, IBM changed the way it sold technology. Rather than offer hardware, services and software exclusively in packages, marketers “unbundled” the components and offered them for sale individually. Unbundling gave birth to the multibillion-dollar software and services industries, of which IBM is a world leader today.
The 1970s saw the end of more than a half-century of Watson family leadership, as Thomas J. Watson Jr. stepped down as CEO in 1971. During the same year, the floppy disk became the standard for storing personal computer data. IBM’s supermarket checkout station, introduced in 1973, used glass prisms, lenses and a laser to read product prices. Also that year, bank customers began making withdrawals, transfers and other account inquiries via the IBM 3614 Consumer Transaction Facility, the precursor to today’s Automatic Teller Machines, or ATMs.
With the 1981 debut of the IBM Personal Computer (PC), the IBM brand began to enter homes, small businesses and schools. IBM also made significant investments in research, which produced four Nobel Prize winners in physics, great strides in expanding computing capabilities, and breakthroughs in mathematics, memory storage and telecommunications.
IBM introduced its local area network (LAN) concept in 1985, permitting personal computer users to exchange information and share printers and files within a building or complex, and laying a foundation for network computing and numerous other applications. Throughout the 1980s and early 1990s, IBM was thrown into turmoil by back-to-back revolutions. The PC revolution placed computers directly in the hands of millions of people; then the client/server revolution sought to link all of those PCs with larger computers. The focus was on the desktop and personal productivity, not on business applications across the enterprise. By the autumn of 1995, IBM’s new vision of network computing had become the company’s overarching strategy, driving the next phase of industry growth. In May 1997, IBM dramatically demonstrated computing’s potential with Deep Blue, a 32-node IBM RS/6000 SP computer programmed to play chess on a world-class level. In a six-game match in New York, Deep Blue defeated World Chess Champion Garry Kasparov. Evaluating 200 million chess positions per second, Deep Blue’s calculating power was staggering, with potential applications in fields calling for the systematic exploration of a vast number of variables, among them forecasting weather, modeling financial data, and developing new drug therapies. In 2001, IBM debuted a new generation of servers, the eServer, designed to meet unprecedented demands on the infrastructure supporting e-business.
Internet
A global electronic communications network linking millions of smaller computer networks in more than 190 countries, including commercial, educational, governmental and other networks, all of which use the same set of communications protocols. The first recorded description of social interactions enabled through networking was a series of memos written in August 1962 by J.C.R. Licklider of the Massachusetts Institute of Technology (MIT), discussing his concept of a “galactic network.” Licklider became the first head of the computer research program at Defense Advanced Research Projects Agency (DARPA), beginning in October 1962. While at DARPA, he convinced his associates Ivan Sutherland, Bob Taylor and MIT researcher Lawrence G. Roberts of the importance of this networking concept.
In July 1961, Leonard Kleinrock at MIT published the first paper on packet-switching theory, and in 1964 he published the first book on the subject. Kleinrock convinced others of the theoretical feasibility of communicating with packets rather than dedicated circuits, a major step along the path toward computer networking. The other key step was to make computers talk to each other. With this goal in mind, Roberts, working with Thomas Merrill, connected the TX-2 computer in Massachusetts to the Q-32 in California over a low-speed dial-up telephone line in 1965, creating the first (however small) wide area network (WAN) ever built.
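The packet idea can be sketched in a few lines of Python. This is a toy illustration only (the message and packet size are arbitrary, and no real protocol is modeled): the message is cut into numbered packets that may travel and arrive in any order, then is reassembled by sequence number at the destination.

```python
def packetize(message, size=4):
    """Split a message into (sequence_number, data) packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Restore the original message, regardless of arrival order."""
    return "".join(data for _, data in sorted(packets))

pkts = packetize("LO AND BEHOLD")
pkts.reverse()           # simulate out-of-order arrival
print(reassemble(pkts))  # -> LO AND BEHOLD
```

Because each packet carries its own sequence number, no dedicated end-to-end circuit is needed; any route that eventually delivers all the packets will do.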
In late 1966, Roberts went to DARPA to develop the computer network concept and quickly put together his plan for the “ARPANET,” publishing it in 1967. At the same conference where he presented the concept, a group from RAND presented a paper on packet-switching networks for secure military voice, based on work dating from 1964. It happened that the work at MIT (1961-1967), at RAND (1962-1965) and at the National Physical Laboratory (NPL) in Middlesex, England (1964-1967) had all proceeded in parallel, without any of the researchers knowing about the other projects. After Roberts and the DARPA-funded community had refined the overall structure and specifications for the ARPANET, all this came together in September 1969, when BBN installed the first Interface Message Processor (IMP) at UCLA and the first host computer was connected. One month later, when SRI was connected to the ARPANET, the first host-to-host message was sent from Kleinrock’s laboratory to SRI. Computers were added quickly to the ARPANET during the following years, and in December 1970, the Network Working Group (NWG) finished the initial ARPANET host-to-host protocol, called the Network Control Protocol (NCP). As the ARPANET sites completed implementing NCP in 1971-1972, network users began to develop applications. In October 1972, Robert Kahn organized a large, very successful demonstration of the ARPANET at the International Computer Communication Conference (ICCC). This was the first public demonstration of this new network technology. Also in 1972, electronic mail (now known commonly as “e-mail” or “email”) was introduced.
Motivated by the need of the ARPANET developers for an easy communication and coordination mechanism, Ray Tomlinson at BBN Technologies (originally Bolt, Beranek and Newman) wrote the basic email message send-and-read software, and in July 1972, Roberts expanded its utility by writing the first email utility program to list, selectively read, file, forward, and respond to messages. From there, email took off as the largest network application for over a decade.
The original ARPANET grew into the internet, based on the idea that there would be multiple independent networks, and a key to the rapid growth of the internet has been the free and open access to the basic documents, especially the specifications of the protocols. By 1985, the internet was already well established as a technology supporting a broad community of researchers and developers. It was beginning to be used by other communities for daily computer communications, and e-mail was being used broadly across several communities, often with different systems.
Internet backbone
A high-speed network that carries internet traffic, a backbone is a principal data route between large, strategically interconnected networks and core routers. Internet backbones are the largest online data connections, providing networking facilities to smaller, high-speed internet service providers (ISPs) and requiring high-bandwidth connections and high-performance servers/routers. Backbone networks are primarily owned by commercial, educational, government and military entities, and they give ISPs a consistent way to keep and maintain online information in a secure manner. Key features of an internet backbone include:
- ISPs connect either directly to a backbone or to a larger ISP that is in turn connected to a backbone.
- Smaller networks are interlinked to provide the redundancy required to keep internet services intact in case of a failure. This is done through transit agreements and peering arrangements.
- ISPs also share network capacity and traffic load.
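The redundancy point above can be sketched with a toy graph model in Python (the topology is hypothetical, invented for illustration): when networks are interlinked, traffic can still reach its destination over an alternate path even after one link fails.

```python
from collections import deque

# Hypothetical backbone links between four networks, A through D.
links = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")}

def reachable(src, dst, links):
    """Breadth-first search over undirected links: can src still reach dst?"""
    graph = {}
    for u, v in links:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("A", "D", links))                 # True: e.g. A-C-D
print(reachable("A", "D", links - {("A", "B")}))  # still True: path A-C-D survives
```

With too few interlinks the guarantee disappears: dropping both A-C and C-D would leave A with no route to D at all, which is exactly the failure mode transit and peering agreements exist to prevent.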
Internet Explorer
One of the original web browsers, playing a pivotal role in the early days of the internet, Internet Explorer began with a 1995 licensing agreement between Microsoft and Spyglass (the small company that commercialized the Mosaic web browser), and it emerged amid intense competition between Spyglass and Netscape. Just a few months after Internet Explorer 1, Microsoft released version 2.0, this time attempting to replicate some of the features and design aspects that had made Netscape Navigator so popular. Navigator commanded nearly 90% of the market by 1996, and many early websites had been and were being developed for compatibility with Navigator only, so Microsoft designed IE 2.0 to import bookmarks from Netscape and to support its HTML features, making webpages look as nearly identical as possible in each browser. IE 2.0 was also the first version made available for the Mac OS, although that did not happen until six months after its launch for Windows.
The launch of IE 4.0 in October 1997 is largely considered the beginning of the First Browser War, and was a turning point for Microsoft. By integrating IE functionality with Windows, Microsoft gained an edge on Navigator and boosted its market share, but that strategy also led the firm and its CEO Bill Gates down the path to the infamous antitrust case United States v. Microsoft, filed in 1998. Internet Explorer 5 was released in March 1999, after a developer preview in June and a public preview in November of the previous year, and was later bundled with Windows 98 Second Edition in September 1999. By the time Microsoft upgraded to version 6.0, IE 5 had exceeded 80% of the web browser market share, thanks largely to its integration with Windows.
Internet Explorer 6 shipped with both Windows XP and Windows Server 2003, launching it to nearly 90% market share by 2002. However, IE 6 is remembered for its security failings, many of which stemmed from design decisions. In 2004, US-CERT issued a vulnerability report concluding that IE’s combined vulnerabilities and deep integration with Windows made it a severe liability, and several other security experts urged users not to use it. In 2006, PC World rated IE 6 the eighth-worst tech product of all time, claiming that it “might be the least secure software on the planet.” It was also the last version named “Microsoft Internet Explorer,” a result of the antitrust case. More than five years after version 6, Windows Internet Explorer 7 was released as the default browser for Windows Vista, and it could also replace IE 6 on Windows XP; but perhaps because of the long gap between versions, IE 7 struggled to keep up with even IE 6 in market share, opening the door for Mozilla Firefox to compete.
Largely considered an improvement on the two previous versions of Internet Explorer, IE 8 was still too late to the market to make up for the ground Microsoft had lost to competitors, namely Mozilla Firefox and Google Chrome. IE8 introduced new developer tools and features like accelerators and suggested sites, and improved performance and stability by correcting some of the problems of its predecessor, but it still couldn’t get Microsoft back into the browser race.
With 2011’s Internet Explorer 9, Microsoft changed the user interface and focused on HyperText Markup Language (HTML) 5, CSS3, XHTML, and other performance aspects in an attempt to revive its former market-leading web browser. The company released IE 9 on its own, without an accompanying operating system, and promoted it heavily with a series of epic, high-budget TV commercials and a Tumblr site that poked fun at its own negative image. Reviewers largely deemed IE 9 to be on par with Firefox and Google Chrome technologically, but the new Explorer didn’t disrupt the market the way Microsoft had hoped it would. 2012 saw the release of Internet Explorer 10, designed to run only on Windows 8, but the reception for IE 10 was generally lukewarm among reviewers, although that may have been partly due to the general disdain for Windows 8 as a whole.
What would ultimately become the final Windows web browser named “Internet Explorer” was released with Windows 8.1 in 2013. Its performance was deemed at least on par with both Chrome and Firefox, and although its compatibility still lagged behind the others, its features were comparable. With its flagship browser widely mocked or ignored in the tech world, Microsoft confirmed in March 2015 that it would retire the “Internet Explorer” brand in favor of a new web browser in Windows 10.
Internet Protocol (IP) address
Internet Relay Chat (IRC)
A chat system developed by Jarkko Oikarinen of Finland in the late 1980s. Unlike older chat systems, IRC was not limited to just two participants. There can be many discussions going on at once, with each one assigned a unique channel. Users join an IRC discussion with an IRC client program and internet access. In turn, the IRC server is responsible for making sure that all messages are broadcast to everyone participating in a discussion.
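The client’s side of such a discussion is plain text: each IRC command is a single line ending in a carriage return and line feed, as specified in RFC 1459. A minimal sketch in Python (the nickname and channel names are made up for illustration):

```python
def irc_line(command, *params, trailing=None):
    """Format one raw IRC protocol line, CRLF-terminated."""
    parts = [command, *params]
    if trailing is not None:
        # The final parameter may contain spaces if prefixed with ':'.
        parts.append(":" + trailing)
    return " ".join(parts) + "\r\n"

# Joining a channel and speaking; the server relays the PRIVMSG to
# every user in #retro, which keeps each discussion on its own channel.
print(irc_line("NICK", "hulkfan"))                       # NICK hulkfan
print(irc_line("JOIN", "#retro"))                        # JOIN #retro
print(irc_line("PRIVMSG", "#retro", trailing="hi all"))  # PRIVMSG #retro :hi all
```

A real client would send these lines over a TCP socket to an IRC server and read the server’s relayed messages back over the same connection.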
interrupt
A signal informing a program that a particular event has occurred. When a program receives an interrupt signal, it takes a specified action depending on the signal, which can be to halt its current operation, switch to some other task, or even to ignore the signal. Some interrupt signals can cause a program to suspend itself temporarily to service the specific interrupt. Interrupt signals can come from a variety of sources (for example, every keystroke generates an interrupt signal) and can also be generated by other devices, such as a printer, to indicate that some event has occurred. These are called “hardware interrupts.” Interrupt signals initiated by programs are called “software interrupts,” but can also be called “traps” or “exceptions.” Standard PCs support 256 types of software interrupts and 15 types of hardware interrupts. Each type of software interrupt is associated with an interrupt handler, a routine that takes control when the interrupt occurs. For example, when you press a key on your keyboard, this triggers a specific interrupt handler. The complete list of interrupts and associated interrupt handlers is stored in a table called the interrupt vector table.
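The interrupt vector table can be mimicked in miniature. This toy Python sketch (not real operating-system code; the handler routines are invented for illustration) dispatches an interrupt number through a 256-slot table of handlers, the way a PC maps each interrupt number to its handler:

```python
def keyboard_handler():
    return "keystroke serviced"

def timer_handler():
    return "clock tick serviced"

def ignore():
    return "interrupt ignored"

# 256 software-interrupt slots, all initially set to a do-nothing handler.
interrupt_vector_table = [ignore] * 256
interrupt_vector_table[0x09] = keyboard_handler  # 0x09: classic PC keyboard vector
interrupt_vector_table[0x08] = timer_handler     # 0x08: classic PC timer vector

def raise_interrupt(n):
    """Transfer control to the handler registered for interrupt number n."""
    return interrupt_vector_table[n]()

print(raise_interrupt(0x09))  # -> keystroke serviced
```

On real hardware the table holds addresses of handler routines and the CPU performs this lookup itself; the indexing idea is the same.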
interrupt handler
A routine that takes control when a system interrupt occurs, an interrupt handler typically deals with low-level events in the hardware of a computer system, such as a character arriving at a serial port, or the ticking of a real-time clock. Normally, the operating system is responsible for reactivating a process that is waiting for such a low-level event. It detects this by a shared flag, a shared queue, or by some other synchronization mechanism.
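The shared-flag mechanism can be illustrated in user space with Python’s signal module. This is a rough analogy rather than kernel code: the handler does the minimum possible work, setting a flag, and the rest of the program notices the flag later.

```python
import signal

event_pending = False  # the shared flag

def handler(signum, frame):
    """Runs asynchronously when the signal (a software interrupt) arrives."""
    global event_pending
    event_pending = True

# Register the handler, then deliver the signal to this process
# to simulate the low-level event occurring.
signal.signal(signal.SIGINT, handler)
signal.raise_signal(signal.SIGINT)

if event_pending:
    print("low-level event observed via shared flag")
```

Keeping the handler tiny mirrors real practice: interrupt handlers should finish quickly, deferring any lengthy processing to code that runs outside the interrupt context.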
See Xanadu, Madame.
ion
An atom or grouping of atoms that carries a positive or negative electric charge, after gaining or losing one or more electrons.
IRL
Often used in internet chat rooms, the abbreviation stands for “in real life,” and is typically used by chat room visitors to let someone know that they are talking about something in their actual life, away from the internet world. Can also be used to differentiate between an actor/actress and the character they play.
Iron Age of Comics (1985–present)
An alternate interpretation of the Ages of Comics timeline, in which the Dark Age of Comics (1985–98) and the Modern Age of Comics (mid-1980s–present) are viewed as one era, defined by alternate realities and universes, retcons, reboots, and an overall adult (or “serious”) tone.
Iron Man
The comic book brainchild of Stan Lee, Larry Lieber, and artists Don Heck and Jack Kirby, Iron Man first appeared in Marvel Comics’ Tales of Suspense #39 in 1963. The brilliant Tony Stark was fascinated with building and controlling machines from a very early age. At 15, he entered the undergraduate electrical engineering program at the Massachusetts Institute of Technology (MIT), and he graduated with two Master’s degrees by age 19. At 21, the playboy inherited Stark Enterprises when his parents were killed in a car accident. While taking part in a field test of his military hardware at one of his international plants, Stark’s party was attacked by terrorists. During the skirmish, a land mine went off and lodged a piece of shrapnel near Tony’s heart. Afterward, he equipped one of the battlesuits he had been developing with a magnetic field generator to prevent the shrapnel from reaching his heart. He eventually returned to the US, where his new life was a torment: his armor’s chest plate had to be worn constantly and required frequent recharging, and he kept the armor a secret from everyone, including his fiancée. Turning suicidal and drinking heavily, Tony was supported by Joanna, with whom he shared his secret identity. Joanna encouraged him to use his armor as a super hero.
Iron Man has been portrayed in television cartoons and live-action films. The character appeared in his own cartoon series, which originally aired in 1966, and in Hollywood blockbuster films starring Robert Downey Jr. as Tony Stark, beginning in 2008.
itasha
A Japanese fad that involves decorating a car with paint and/or decals to resemble characters from anime, gaming or manga. Often translated as “painmobile,” the name derives from the embarrassment of driving such a conspicuous car around.