
Notes: Data Representation, Software & Programming language | Information and Communication Technology (ICT) for UGC NET

Data Representation

Data representation refers to the methods used internally to represent information stored in a computer.


Number System

It defines a set of values that is used to represent quantity. Digital computers internally use the binary number system to represent the data and perform arithmetic calculations.

The Binary Number System, also known as the Base 2 system, uses only the binary digits (bits) 0 and 1, which serve as the fundamental unit of computer calculation. Binary is highly efficient for computer operations, though less intuitive for human comprehension.

The Octal Number System (Base 8) uses the digits 0 to 7, and the Hexadecimal Number System (Base 16) uses the digits 0 to 9 and the letters A to F. In every positional number system, each position represents a successive power of the base; for example, each position of a hexadecimal number represents a successive power of 16.

Decimal, Binary, Octal, and Hexadecimal Equivalents:

  • Decimal: 0, Binary: 0000, Octal: 0, Hexadecimal: 0
  • Decimal: 1, Binary: 0001, Octal: 1, Hexadecimal: 1
  • Decimal: 2, Binary: 0010, Octal: 2, Hexadecimal: 2
  • Decimal: 3, Binary: 0011, Octal: 3, Hexadecimal: 3
  • Decimal: 4, Binary: 0100, Octal: 4, Hexadecimal: 4
  • Decimal: 5, Binary: 0101, Octal: 5, Hexadecimal: 5
  • Decimal: 6, Binary: 0110, Octal: 6, Hexadecimal: 6
  • Decimal: 7, Binary: 0111, Octal: 7, Hexadecimal: 7
  • Decimal: 8, Binary: 1000, Octal: 10, Hexadecimal: 8
  • Decimal: 9, Binary: 1001, Octal: 11, Hexadecimal: 9
  • Decimal: 10, Binary: 1010, Octal: 12, Hexadecimal: A
  • Decimal: 11, Binary: 1011, Octal: 13, Hexadecimal: B
  • Decimal: 12, Binary: 1100, Octal: 14, Hexadecimal: C
  • Decimal: 13, Binary: 1101, Octal: 15, Hexadecimal: D
  • Decimal: 14, Binary: 1110, Octal: 16, Hexadecimal: E
  • Decimal: 15, Binary: 1111, Octal: 17, Hexadecimal: F
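The equivalents listed above can be checked with a short Python sketch (an illustration added to these notes, using Python's built-in base formatting):

```python
# Print each value 0-15 in the four number systems discussed above.
# 'b', 'o', and 'X' are Python's binary, octal, and hexadecimal
# format codes; '04b' pads binary to four digits.
for n in range(16):
    print(f"decimal {n:2d} = binary {n:04b} = octal {n:2o} = hexadecimal {n:X}")
```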

Conversion Between the Number Systems:

Different types of conversion between the number systems are discussed below:

  1. Decimal to Binary: To convert decimal to binary, follow these steps:
  • Divide the given number by 2.
  • Note the quotient and the remainder; the remainder will be 0 or 1.
  • If the quotient is not 0, divide it by 2 again and repeat the previous step.
  • If the quotient is 0, stop the process.
  • The first remainder obtained is the 'Least Significant Bit' (LSB) and the last remainder is the 'Most Significant Bit' (MSB).
  • Write all remainders from MSB to LSB to obtain the binary number.

Example: Convert (43)₁₀ to binary.

  2 | 43 | Remainder
  2 | 21 | 1
  2 | 10 | 1
  2 |  5 | 0
  2 |  2 | 1
  2 |  1 | 0
    |  0 | 1

Reading the remainders from MSB to LSB gives (43)₁₀ = (101011)₂.
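The repeated-division steps above can be sketched in Python (a minimal illustration added to these notes, not an exam requirement):

```python
def decimal_to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to a binary string
    by repeated division by 2, collecting remainders LSB first."""
    if n == 0:
        return "0"
    remainders = []
    while n != 0:
        n, r = divmod(n, 2)        # quotient and remainder (0 or 1)
        remainders.append(str(r))
    # The first remainder is the LSB and the last is the MSB,
    # so reverse the list to read the bits from MSB to LSB.
    return "".join(reversed(remainders))

print(decimal_to_binary(43))  # → 101011
```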

2. Binary to Decimal Conversion Process:

  • Multiply each binary digit by the corresponding power of 2.
  • For the integral part, use positive powers; for the fractional part, use negative powers.
  • Add all the products.

Example:

(1101.10)₂ → (13.5)₁₀

(1101.10)₂ = (1 × 2³) + (1 × 2²) + (0 × 2¹) + (1 × 2⁰) + (1 × 2⁻¹) + (0 × 2⁻²) = 8 + 4 + 0 + 1 + 0.5 + 0 = 13.5
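The same positive/negative-powers rule can be sketched in Python (an illustration added to these notes; the function name is our own):

```python
def binary_to_decimal(b: str) -> float:
    """Convert a binary string, optionally with a fractional part,
    to decimal: positive powers of 2 for the integer digits,
    negative powers for the fractional digits."""
    integer, _, fraction = b.partition(".")
    # Integer part: rightmost digit is multiplied by 2^0, then 2^1, ...
    value = sum(int(bit) * 2**i for i, bit in enumerate(reversed(integer)))
    # Fractional part: first digit after the point is multiplied by 2^-1, ...
    value += sum(int(bit) * 2**-(i + 1) for i, bit in enumerate(fraction))
    return value

print(binary_to_decimal("1101.10"))  # → 13.5
```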

Computer Codes:

Computers use standard codes to represent characters and numbers internally in binary form.

  • ASCII-7:
    • A 7-bit code that can represent 128 unique symbols (codes 0 to 127).
  • ASCII-8:
    • An extended version of ASCII-7.
    • An 8-bit code that can represent 256 unique symbols or characters (codes 0 to 255).
  • EBCDIC:
    • In EBCDIC (Extended Binary Coded Decimal Interchange Code), each character is represented by 8 bits.
    • These codes store information in a form readable by other computers.
    • It allows 256 combinations of bits (codes 0 to 255).
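Python's built-in ord() and chr() expose the character-code mapping directly; a small sketch of the ASCII range described above:

```python
# ord() gives a character's numeric code; chr() is the inverse.
print(ord("A"))                  # → 65
print(chr(97))                   # → a
# The 7-bit ASCII pattern for 'A' (65 = 1000001 in binary):
print(format(ord("A"), "07b"))   # → 1000001
```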

Software

Software refers to a collection of computer programs and related data that provide instructions for telling a computer what to do and how to do it. It acts as an interface between the user and the computer.


 Software can be categorized into various types:

1. System Software

System software helps the computer manage its internal resources and includes several programs responsible for controlling, integrating, and managing the individual hardware components of a computer system. Common examples include:

(i) Operating System (OS)

An Operating System (OS) is software, or a series of programs, that manages the computer's resources, performing process management, memory management, file management, and input/output management. It is the first program that runs when the computer boots up. Operating systems can be classified as follows:

  • Single-user: Allows only one user at a time (e.g., MS-DOS, Windows XP).
  • Multi-user: Allows multiple users to run programs simultaneously (e.g., Unix, Linux, Windows 2000/7).
  • Multi-tasking: Allows more than one program to run concurrently (e.g., Linux, Unix, Windows 95).
  • Multi-processing: Supports running a program on more than one CPU (e.g., Unix, Windows NT/2000).
  • Real-time: Used for real-time applications like satellite launch and weather forecasting (e.g., Linux, HP-RT).

(ii) Device Drivers

Device drivers are software written to make a device functional when connected to the computer. Each device, such as a printer, monitor, mouse, or keyboard, has an associated driver program for proper functioning.

2. System Utilities

System utilities support, enhance, expand, and secure existing programs and data in the computer system. Main functions include:

  • Disk Compression: Increases the amount of information stored on secondary storage devices by compressing all data on a hard disk.
  • Disk Defragmenter: Rearranges fragmented files and consolidates unused space on storage devices so that files can be accessed more quickly.
  • Backup Utilities: Create copies of all information stored on a disk and can restore the entire disk or selected files.
  • Disk Cleaners: Find and remove unnecessary or long-unused files, freeing space and helping a slow computer run faster.
  • Anti-virus: Scans the computer for viruses and prevents files from being corrupted (e.g., Norton, Quick Heal).

3. Application Software

Application software is designed to help users perform specific tasks. These programs are also known as end-user programs. They can be classified into:

(i) General Purpose Software

These are used for general functions and allow users to perform common tasks. Examples include word processing software, electronic spreadsheets, DBMS, desktop publishing, and multimedia software.

(ii) Specific Purpose Software

These are designed to execute specific tasks, such as inventory management, payroll, hotel management, reservation systems, report card generators, billing systems, and HR management systems.

4. Open Source Software

Open source software refers to software whose source code is publicly accessible, allowing anyone to modify and share it. Key criteria for open source software include:

  • The software must be available for free or at a low cost.
  • Source code must be included.
  • Anyone must be allowed to modify the source code.
  • Modified versions can be redistributed.

Examples of open source software include Linux, OpenOffice, Apache HTTP Server, Mozilla Firefox, Chromium (the open source base of Google Chrome), and Python.

Programming Language

A programming language is a set of keywords, symbols, and rules used to construct statements that allow humans to communicate instructions for a computer to execute.

Programming languages can be broadly categorized into three types:

1. Low-Level Language

Low-level languages are designed to operate and manage the entire instruction set of a computer system, directly interacting with hardware. They are further divided into two types:

(i) Machine Language

Machine language, also known as machine code or object code, is a collection of binary digits (0s and 1s) that the computer's central processing unit (CPU) reads and interprets. It is the most fundamental level of programming and is specific to the architecture of the computer's processor.

(ii) Assembly Language

Assembly language is one step above machine language. It uses symbolic names (mnemonics) instead of binary codes, making it easier for humans to read and write. Assembly language is used to interface with computer hardware directly and requires an assembler to convert it into machine code.

2. Medium-Level Language

Medium-level languages act as a bridge between low-level hardware operations and high-level programming. They combine the elements of both low-level and high-level languages, enabling efficient interaction with hardware while providing more abstraction than low-level languages. An example is the C language, which allows direct manipulation of hardware addresses and memory but also provides high-level constructs for easier programming.

3. High-Level Language

High-level languages are designed to be easy for humans to read and write. They are several steps removed from the actual machine code processed by the computer's CPU. High-level languages abstract the complex details of the computer's hardware, enabling programmers to focus on logic and functionality. Examples of high-level languages include BASIC, C, FORTRAN, Java, and Pascal.

Language Translator

Language translators are essential tools that convert programs written in various programming languages into machine language, producing what is known as object code. 

There are three primary types of language translators:

1. Assembler

An assembler converts assembly language into machine language (binary code). Assembly language consists of mnemonic codes that are difficult to learn and are machine-dependent. The assembler translates these mnemonics into the corresponding machine code that the CPU can execute.

2. Compiler

A compiler translates source code written in a high-level language into machine language. It reads the entire source code in a single run, identifying and reporting any errors to the programmer. Once the code is error-free, the compiler generates the machine code, which can be executed by the computer.

3. Interpreter

An interpreter converts a high-level language program into machine language line-by-line. It reads and executes each line of code sequentially, reporting errors as they occur. This allows for immediate debugging and testing of code, but it generally runs slower than compiled code because it translates each line every time the program is run.

Examples and Elaborations

  • Machine Language: Machine language instructions might look like "10110000 01100001", which is difficult for humans to read but can be executed directly by the CPU.
  • Assembly Language: An assembly language instruction might be "MOV AL, 61h", which moves the hexadecimal value 61 into the AL register. This is easier to understand than binary machine code.
  • High-Level Language: A high-level language statement might be "print('Hello, World!')", which is intuitive and easy to understand, showing how high-level languages abstract away the complexity of machine operations.
  • Compiler Example: A C program written with a "printf" statement is compiled using a C compiler, which translates the entire program into executable machine code before running it.
  • Interpreter Example: A Python program with a "print" statement is interpreted line-by-line, allowing immediate execution and debugging.
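As a concrete illustration of the translation step, CPython itself first compiles source code into bytecode and then interprets that bytecode; the built-in compile() function and the standard dis module make this visible (a sketch of CPython's pipeline, not a general-purpose compiler):

```python
import dis

# Translate one line of source into a code object (CPython bytecode).
source = "print('Hello, World!')"
code_object = compile(source, "<example>", "exec")

# Show the translated low-level instructions...
dis.dis(code_object)

# ...then let the interpreter execute them: prints Hello, World!
exec(code_object)
```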


FAQs on Notes: Data Representation, Software & Programming language - Information and Communication Technology (ICT) for UGC NET

1. What is data representation and why is it important in computer science?
Ans. Data representation refers to the way data is stored and processed in a computer system. It is important in computer science because different data representations have different advantages and limitations in terms of efficiency, storage requirements, and ease of manipulation.
2. What are the different number systems used in computer science and how are they represented?
Ans. The main number systems used in computer science are binary, decimal, octal, and hexadecimal. Binary is represented using 0s and 1s, decimal uses the digits 0-9, octal uses digits 0-7, and hexadecimal uses digits 0-9 and letters A-F.
3. How does software differ from programming languages in computer science?
Ans. Software refers to programs and applications that run on a computer system, while programming languages are used to write and create those software programs. Software is the end product that users interact with, while programming languages are the tools used to create that software.
4. What is the role of a language translator in computer programming?
Ans. A language translator is a program that converts code written in a high-level programming language into machine code that can be executed by a computer. This translation process allows programmers to write code in a more human-readable format while still being able to run it on a computer.
5. How does understanding data representation, software, and programming languages benefit computer science students and professionals?
Ans. Understanding data representation, software, and programming languages is essential for computer science students and professionals as it allows them to design efficient algorithms, develop robust software applications, and troubleshoot code effectively. It helps in building a strong foundation in computer science principles and practices.