What is a Computer Language and its Types? Generations of Computers: 1G to 5G

What is Computer Language?

Languages are a way to explain things easily and effectively. A computer also understands a language of its own, made up of bits (i.e. 0s and 1s), known as machine language. Because sequences of 0s and 1s are hard for humans to read and write, every program is first written in a human-readable form and then translated into machine language by a compiler or an assembler. This human-readable form is known as a high-level language.
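
As a minimal sketch of this pipeline (assuming a GCC-style toolchain; any C compiler follows the same steps), the program below is written as plain human-readable text and only becomes machine language after the translation steps noted in the comment:

    /* hello.c -- human-readable source code; the CPU never executes
     * this text directly. Assuming the GCC toolchain:
     *   gcc -S hello.c        (compile: C source -> assembly, hello.s)
     *   gcc -c hello.c        (compile + assemble: -> machine code, hello.o)
     *   gcc hello.c -o hello  (full pipeline: -> runnable binary)
     */
    #include <stdio.h>

    int main(void) {
        printf("Hello, machine!\n");
        return 0;
    }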

Types of Computer Languages:

There are mainly three types of computer languages: machine language, assembly language, and high-level languages.

Assembly language: 

An assembly language (often abbreviated asm) is a low-level programming language designed for a specific type of processor. It may be produced by compiling source code from a high-level programming language (such as C/C++), but it can also be written from scratch. Assembly code is converted to machine code by an assembler. Today, it is used primarily for direct hardware manipulation, access to specialized processor instructions, or to address critical performance issues. Typical uses include device drivers, low-level embedded systems, and real-time systems.
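
As a hedged illustration, here is a tiny C function together with the kind of x86-64 assembly a compiler such as GCC might emit for it with "gcc -O2 -S" (the exact output varies by compiler, version, and processor):

    /* add.c: on x86-64 Linux, "gcc -O2 -S add.c" typically produces
     * assembly along these lines (output varies):
     *
     *   add:
     *       leal    (%rdi,%rsi), %eax    # eax = a + b in one instruction
     *       ret                          # return with the result in eax
     */
    int add(int a, int b) {
        return a + b;
    }

Notice how each assembly line corresponds closely to a single processor instruction; that one-to-one mapping is what makes assembly processor-specific.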

Machine language:

Machine language, or machine code, is a low-level language composed of binary digits (ones and zeros). High-level languages, such as Java, Python, PHP, Swift, and C++, must be compiled into machine language before the code can run on a computer. Since computers are digital devices, they recognize only binary data.
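
To make this concrete, the sketch below prints the raw bits of a plausible machine-code sequence. The byte values shown are the standard x86-64 encoding of "mov eax, 42" followed by "ret"; which bytes a real compiler emits depends on the target processor.

    #include <stdio.h>

    /* x86-64 machine code for a function that simply returns 42:
     *   b8 2a 00 00 00   mov eax, 42
     *   c3               ret
     * Other processor families use entirely different binary encodings. */
    static const unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    int main(void) {
        /* Print each machine-code byte as the bits the CPU actually decodes. */
        for (size_t i = 0; i < sizeof code; i++) {
            for (int bit = 7; bit >= 0; bit--)
                putchar(((code[i] >> bit) & 1) ? '1' : '0');
            putchar(' ');
        }
        putchar('\n');
        return 0;
    }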

High-level languages:

High-level languages allow programmers to write instructions in a form that is much easier to understand than low-level languages. Translators are needed to convert programs written in high-level languages into the machine code a computer understands. Examples include Python, Visual Basic, Delphi, Perl, PHP, ECMAScript, Ruby, C#, Java, and many others.
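
The contrast is easy to see in a short sketch: the C code below reads almost like a description of the task, while the translator quietly turns it into the low-level instructions shown in the earlier sections (C is used here only as a familiar example of a compiled high-level language):

    #include <stdio.h>

    int main(void) {
        int total = 0;
        /* Named variables and a readable loop: the programmer never
         * touches registers or machine instructions directly. */
        for (int n = 1; n <= 10; n++)
            total += n;
        printf("sum of 1..10 = %d\n", total);
        return 0;
    }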

Generations of Computers: 1G to 5G

A computer is an electronic device used to perform tasks such as arithmetic operations, logical operations, and programmed tasks. Computers have evolved over time: as new technologies arrived, they became more efficient and more capable. This evolution is usually divided into five generations, described below:

1st Generation Computer (1940 – 1956):

First-generation computers used vacuum tubes for circuitry and magnetic drums for memory. They relied on machine language (the most basic programming language understood by computers) and were limited to solving only one problem at a time. Input was given through punched cards and paper tape, and output came in the form of printouts. The two notable machines of this generation were ENIAC and UNIVAC; UNIVAC was the first-ever commercial computer.

2nd Generation Computer (1956 – 1963):

In this generation, vacuum tubes were replaced by transistors. Although the transistor was invented in 1947, it was not used significantly in computers until the end of the 1950s. Programming languages evolved from machine language to symbolic ('assembly') languages, which meant programmers could write instructions in words. Around the same time, high-level programming languages such as COBOL and FORTRAN were also developed. Transistor-driven machines were the first computers to store instructions in their memories, moving from magnetic-drum to magnetic-core technology.

3rd Generation Computer (1964 – 1971):

In this generation, transistors were miniaturized and placed on silicon chips (semiconductors), commonly termed integrated circuits. This led to a massive increase in the speed and efficiency of these machines. Users could now interact with these computers through keyboards and monitors that interfaced with an operating system, a significant jump from punched cards and printouts.

4th Generation Computer (1972 – 2010):

In this generation, the microprocessor was introduced: thousands of integrated circuits built onto a single chip. Intel was the first company to use this technology; its Intel 4004 chip, developed in 1971, placed all the computer's components (CPU, memory, and input/output controls) onto a single chip. What filled a room in the 1940s now fit in the palm of the hand. Other major advances of this period include the graphical user interface (GUI), the mouse, and, more recently, the astounding advances in laptop capability and hand-held devices.

5th Generation Computer (2010 – Present):

In this generation, devices are integrated with artificial intelligence (AI), and some features of AI are beginning to emerge in applications such as voice recognition. AI is made possible by parallel processing and superconductors. Looking to the future, computers will be further transformed by quantum computing, molecular computing, and nanotechnology.
