Monday, February 25, 2019
Computer History and Development Essay
The dictionary defines a computer as an electronic device for storing and processing data, typically in binary form, according to operating instructions given to it in a variable program. Primarily created to compute, today's sophisticated computers nevertheless do much more: supermarket scanners calculate a consumer's grocery bill while keeping track of store inventory; computerized telephone switching centers play traffic cop to millions of calls, keeping lines of communication untangled; and automatic teller machines let banking transactions be conducted from virtually anywhere in the world. Technology has been around for centuries, advancing quickly year by year, and one of the most important items technology has produced is the computer. The Electronic Numerical Integrator and Computer, also known as ENIAC, is regarded as the first general-purpose electronic computer.

What came before the ENIAC? Well, there is the abacus, which some consider the first computer. Created over 5,000 years ago in Asia and still in use today, it lets users make computations with a system of sliding beads arranged on a rack. In early times the abacus was used to record trading transactions, until it became obsolete with the introduction of pencil and paper. Within the next twelve centuries emerged a significant advancement in computer technology. The year was 1642, when Blaise Pascal, the 18-year-old son of a French tax collector, invented the numerical wheel calculator, also known as the Pascaline. The Pascaline was a brass rectangular box that used eight movable dials to add sums up to eight figures long (a short sketch of this dial-and-carry mechanism appears at the end of this passage). The device became popular in Europe; its only drawback was that it was limited to addition (Pascal's calculator, 2010, para.).

Another inventor who built on the Pascaline was Gottfried Wilhelm von Leibniz, a German mathematician and philosopher of the 1600s. Leibniz added to the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials, and original notes and drawings from the Pascaline were used to help refine his machine. The core of the machine was its stepped-drum gear design. Mechanical calculators did not gain widespread use, however, until the early 1800s, when a Frenchman, Charles Xavier Thomas de Colmar, invented a machine that could perform the four basic arithmetic functions. The arithmometer, Colmar's mechanical calculator, presented a more practical approach to computing because it could add, subtract, multiply, and divide, and it was widely used up until the First World War. Although later inventors refined Colmar's calculator, together with fellow inventors Pascal and Leibniz he helped define the age of mechanical computation.

The real beginnings of the computers we use today came in the early 1800s, thanks to Charles Babbage and his invention of the Analytical Engine. Babbage's machine was to be steam powered, and although it was never constructed, it outlined the basic elements of a modern general-purpose computer. Several more inventors added to the machines of the late 1800s, helping pave the way for the first generation of computers (1945-1956) (LaMorte, C. & Lilly, J., 2010, para. 4).
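Before moving on to the war years, here is the promised sketch of the Pascaline's dial-and-carry idea, written as a short Python program. This is my own illustration of the mechanism, not anything drawn from the sources cited above: eight decimal dials each hold one digit, and a dial turned past nine wraps back to zero and trips a carry into the next dial, much as the brass gears did.

    # A toy model of the Pascaline's eight dials (an illustration only):
    # each dial holds a digit 0-9, least significant dial first.
    DIALS = 8

    def to_dials(n):
        """Split n into 8 decimal digits, least significant first."""
        return [(n // 10**i) % 10 for i in range(DIALS)]

    def pascaline_add(dials, n):
        """Advance each dial by the matching digit of n, rippling carries."""
        carry = 0
        for i, digit in enumerate(to_dials(n)):
            total = dials[i] + digit + carry
            dials[i] = total % 10  # the dial wraps past 9 back to 0...
            carry = total // 10    # ...and trips a carry into the next dial
        return dials               # a carry off the last dial is simply lost

    register = to_dials(0)
    pascaline_add(register, 1787)
    pascaline_add(register, 942)
    print(sum(d * 10**i for i, d in enumerate(register)))  # prints 2729

Run as written, the sketch adds 1787 and 942 on the dials and prints 2729; the fixed set of eight dials is also why the real device could only add sums up to eight figures long.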
Wars played a great part in the advancement of modern computers; during the Second World War, governments sought to develop computers to exploit their potential strategic importance. Thus, in 1941 a German engineer, Konrad Zuse, developed the Z3, a computer created to help design airplanes and missiles (Computer History Museum: Timeline of Computer History, 2010, para. 3). Another computer created for wartime was the ENIAC, first commissioned for use in World War II but not completed until one year after the war had ended. It was installed at the University of Pennsylvania, in partnership with the U.S. government; its 40 separate eight-foot-high racks and 18,000 vacuum tubes were intended to help calculate ballistic trajectories. There were also 70,000 resistors and more than 4 million soldered joints: truly a massive piece of machinery, one that consumed around 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of Philadelphia. The ENIAC was a major development, with speeds 1,000 times faster than the earlier Mark I. Over the next 40 years, John von Neumann, along with the University of Pennsylvania team, kept initiating new concepts in computer design. With the combined genius of all the personnel, they continued with new products such as the central processing unit (CPU) and the UNIVAC. The Universal Automatic Computer (UNIVAC) became one of the first commercially available computers to take advantage of the CPU, and it helped out the U.S. Census Bureau.

First generation computers were characterized by the fact that operating instructions were made to order for the specific task for which each computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate (a toy example of such a binary-coded program appears at the end of this passage). This made computers difficult to program and limited their versatility and speed. Other distinctive features of first generation computers were the use of vacuum tubes, known for their breathtaking size, and magnetic drums for data storage (LaMorte, C. & Lilly, J., 2010, para. 10).

The second generation of computers, from 1956-1963, began the age of smaller computers. With the invention of the transistor in 1948, the bulky vacuum tubes in televisions, radios, and computers could all be replaced. The transistor became available in a working computer in 1956, and the size of computers has been shrinking ever since (LaMorte, C. & Lilly, J., 2010, para. 13). Along with smaller computers, transistors paved the way for faster, more reliable, and more energy-efficient products, thanks in part to advances in magnetic-core memory. The first machines to take advantage of this newfound technology were the early supercomputers, IBM's Stretch and Sperry-Rand's LARC; these supercomputers were in demand by atomic scientists because of the enormous amounts of data they could handle. By 1965, most large businesses processed financial information using second generation computers, and with the second generation computer came new career opportunities such as programmer, analyst, and computer systems expert. Although transistors were an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts; the quartz rock eliminated this problem (LaMorte, C. & Lilly, J., 2010, para. 16).
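Stepping back for a moment to the first generation's binary-coded programs, the toy sketch below shows what it means for a machine language to tell a computer how to operate. The four-instruction machine here is entirely invented for illustration; it is not the instruction set of the ENIAC, the UNIVAC, or any real computer. Each instruction is one 8-bit word: the high four bits are an opcode, the low four bits an operand.

    # A made-up machine language (hypothetical, for illustration only):
    # each 8-bit word = 4-bit opcode + 4-bit operand.
    PROGRAM = [
        0b0001_0110,  # LOAD 6  -> accumulator = 6
        0b0010_0111,  # ADD 7   -> accumulator += 7
        0b0011_0000,  # PRINT   -> output the accumulator
        0b0000_0000,  # HALT
    ]

    def run(program):
        acc = 0  # a single accumulator register
        for word in program:
            opcode, operand = word >> 4, word & 0b1111
            if opcode == 0b0001:    # LOAD: put the operand in the accumulator
                acc = operand
            elif opcode == 0b0010:  # ADD: add the operand to the accumulator
                acc += operand
            elif opcode == 0b0011:  # PRINT: output the accumulator
                print(acc)          # prints 13
            elif opcode == 0b0000:  # HALT: stop the machine
                break

    run(PROGRAM)

Because every early machine had its own opcodes, a program written for one computer was meaningless on another, which is exactly why first generation computers were so difficult to program and so limited in versatility and speed.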
Third generation computers (1964-1971) began when engineer Jack Kilby of Texas Instruments developed the IC (integrated circuit) in 1958. The IC combined three electronic components onto a small silicon disc, which was made from quartz. Later on, scientists were able to fit even more electronic components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto these chips. The third generation computer also gave birth to the operating system, which allowed machines to run many different programs at once, with a central program that coordinated and monitored the computer's memory (LaMorte, C. & Lilly, J., 2010, para. 16).

With the fourth generation of computers (1971-2000), the only thing left to do was go down in size. Three major kinds of chips helped with computer downsizing: the LSI, the VLSI, and the ULSI. Large scale integration (LSI) could fit hundreds of components onto one chip; very large scale integration (VLSI) could fit hundreds of thousands of components onto one chip; and ultra-large scale integration (ULSI) could fit millions of components onto one chip (LaMorte, C. & Lilly, J., 2010, para. 17). The size and price of computers went down because so much could now be put into an area about half the size of a U.S. dime. Intel, which was founded in 1968, developed the Intel 4004 chip in 1971, and such chips would become standard in everyday household items such as microwaves, television sets, and automobiles.

Such condensed power allowed for a new market: the general public. Computers were no longer developed exclusively for large businesses or government contracts. In the mid-1970s, computer manufacturers sought to bring computers to the general consumer. These smaller, sleeker computers came with more user-friendly software packages, such as word processing and spreadsheet programs, and early companies that took advantage of selling them were Commodore, Radio Shack, and Apple Computers. In 1981, IBM launched its personal computer for multi-purpose use in the home, office, and schools. IBM made the personal computer even more affordable, and the numbers increased rapidly within the next year: personal computer usage more than doubled, going from 2 million in 1981 to 5.5 million in 1982. Fast forward 10 years, and 65 million PCs were owned by general consumers. With the introduction of the Human-Computer Interface (HCI), users could now control the screen cursor with a mouse that mimics the hand's movements, instead of typing every instruction. Smaller computers also became more powerful, especially in the workplace, where they could be linked together to share memory space and software and to communicate with each other. This was achieved using telephone lines or direct wiring called a Local Area Network (LAN) (LaMorte, C. & Lilly, J., 2010, para. 20).

The fifth generation of computers (the present and beyond) is a generation still in the works, with some great advancements in computer technology expected from the continued utilization of computer chips. One of the major components of a computer is the chip; chips are constructed of semiconductor materials, and semiconductors eventually wear out. A semiconductor is a material typically made of silicon or germanium, both of which are neither good conductors of electricity nor good insulators. These materials are then doped to create an excess or lack of electrons (Semiconductor, 2010, para. 2). Integrated circuits grow old and die, or are discontinued.
This aging process can play out in many ways. Modern chips as used in computers have millions of transistors printed on a small piece of silicon no bigger than a fingernail, and each microscopic transistor is connected to the others on the surface of the chip with even smaller aluminum or copper wires. Over the years, the thermal stress of turning the computer on and off can cause tiny cracks in these wires. As the computer warms up, the wires can part and cause the computer to stop working. Even a few seconds of off-time can cool the system enough to allow the wires to reconnect, so your computer may work just fine for a few minutes, or hours, and then fail once it warms up; letting it cool off can bring it back to life for a few minutes or more (Computer Freezes and Crashes, 2010, para. 16). Of course, some chips are much more prone to failure than others. Competitors try to gain an advantage in the market by building cheaper or faster chips, but cheaper and faster means hotter and shorter-lived parts. Better quality equals higher prices, and when the price goes up, nobody buys the product; low-quality products die of old age too early and get a bad name, which also keeps them from selling. Most modern computers are constructed from the cheapest parts available. With this being known, Intel, one of the best chip manufacturers, designs its parts to be very resilient and to endure heat and malfunction.

Intel was founded on July 18, 1968, as Integrated Electronics Corporation. Intel Corporation is a worldwide semiconductor chip maker based in Santa Clara, California, and is the world's largest semiconductor chip maker based on revenue. Intel invented the x86 series of microprocessors, the processors found in most personal computers (Intel, 2010, para. 20).

Intel, along with other competing companies, is predicting no more mice or keyboards by 2020. Right now, with Intel-developed sensors, scientists are hoping they can find ways to harness brain waves to operate computers; this would all be done, of course, with the consumer's permission. Scientists believe that consumers would want the freedom gained by using such an implant. The idea may seem far-fetched now, but 20 years ago, anyone telling a person that it would become almost necessary to carry a computer around would have been rebutted. Look around now: people cannot leave a computer or computing device at home, or even in a vehicle, without feeling like something is missing, an almost naked feeling. Scientists believe that consumers will grow tired of depending on a conventional computer interface; whether it is fishing out accessories or just using their hands to interact, they think consumers would prefer to manipulate various devices with their brains. Currently, a research team from Intel is working on decoding human brain activity. The team has used Functional Magnetic Resonance Imaging (fMRI) machines, which detect blood-flow changes in certain areas of the brain based on what word or image the consumer is thinking of. The idea sounds far-fetched, but almost two years ago scientists in the U.S. and Japan announced that a monkey's brain had been used to control a humanoid robot. Scientists and the Intel team are currently working on getting to the point where it is possible to mentally type words by thinking about letters (Intel: Chips in brains will control computers by 2020, 2010, para. 4).
The story of t   he computer is amazing to see how far technology has come is almost unreal. Evolving from the first computer the ENAIC, a huge machine that had thousands of tubes everywhere computers are now small enough to be placed in a brief case for on the go use. Furthermore, with the everyday advancement of technology it wont be long before farfetched ideas become a reality.  