Linux Programming Unit 1

Linux is a free and open-source operating system originally created as a free version of UNIX. It has a modular kernel and is highly portable, running on everything from embedded devices to supercomputers. The Linux operating system architecture consists of the kernel, system libraries, system utilities, hardware layer, shell, and file system hierarchy. Common file handling utilities in Linux include commands like cp for copying, mv for moving and renaming, and ln for creating links between files.

Unit -1

Linux Programming
History of Linux:
• Linux is a Unix clone written from scratch by Linus
Torvalds with assistance from a loosely-knit team of
hackers across the Net.
• Unix is a multitasking, multi-user computer
operating system originally developed in 1969 by a
group of AT&T employees at Bell Labs.
• Linux and Unix strive to be POSIX compliant.
• POSIX, an acronym for "Portable Operating System
Interface", is a family of standards specified by the
IEEE for maintaining compatibility between
operating systems.
 Development started in 1991
◦ Linus Torvalds wanted to create a free
implementation of UNIX
◦ By 1993 there were 12000 Linux users
◦ Today Linux rivals UNIX in stability and
scalability
• Linux is a UNIX clone
– It can run on 32 bit and 64 bit hardware
– Linux is a true multitasking environment
– Fully capable of taking advantage of
multiple processors
– Can address up to 64 GB of RAM
– Partial POSIX Compliance
Features of Linux:
 Linux is free
◦ Anyone can download and compile the source
◦ The code can be modified by anyone, provided the modifications are released to the community
 Linux by itself is not a complete operating system: Linux is a kernel
◦ A kernel is a program that allocates and controls hardware resources in a system
◦ Linux distributions use the Linux kernel together with the GNU operating system
 Portable (multiplatform)
 Multitasking
 Multi-user
 Multiprocessor (SMP) support
 Multithreading support
 Virtual memory
 Hierarchical file system
 Graphical user interface (X Window System, Wayland)
 Wide hardware support
 Dynamically linked shared libraries as well as static libraries
 POSIX compliant (almost)
 Multiple virtual consoles
 Multiple filesystem support
 Networking support
 Shell
 Strong security model

Architecture & Components of Linux System:
The Linux operating system's architecture primarily has these components: the kernel, system libraries, system utilities, the hardware layer, and the shell.
1. The kernel is the core part of the operating system, responsible for all the major activities of the Linux operating system. It consists of different modules and interacts directly with the underlying hardware. The kernel offers the required abstraction to hide low-level hardware details from application programs. The types of kernels are as follows:
• Monolithic kernels
• Microkernels
• Exokernels
• Hybrid kernels
2. System libraries are special functions that are used to implement the functionality of the operating system and do not require the code access rights of kernel modules.
3. System utility programs are responsible for individual, specialized tasks.
4. The hardware layer of the Linux operating system consists of devices such as the RAM, HDD and CPU.
5. The shell is an interface between the user and the kernel, and it provides access to the services of the kernel. It takes commands from the user and executes the kernel's functions. Shells are classified into two types: command-line shells and graphical shells.
Prepared by G. Pradeep Reddy, Lecturer, Dept. C.S.E, JNTUACEA, Anantapur Page 1


The command-line shells provide a command-line interface, while the graphical shells provide a graphical user interface. Both kinds of shell perform the same operations, but graphical shells are slower than command-line shells. Shells are classified into four types:
 Korn shell
 Bourne shell
 C shell
 POSIX shell

Versions and Flavors of Linux:
1. Ubuntu
2. Fedora
3. Linux Mint
4. openSUSE
5. PCLinuxOS
6. Debian
7. Mandriva
8. Sabayon/Gentoo
9. Arch Linux, Slackware
10. Puppy Linux, DSL

File Handling Utilities:
These are the Linux commands which help you to create, delete, rename, move, copy, edit and perform other related activities on Linux files. To Linux, a file is a named collection of related data that appears to the user as a single, contiguous block of information and that is retained in storage.

A simplified UNIX directory/file system:
• /bin: contains executable files for most of the Unix commands.
• /dev: contains files that control various input and output devices.
• /lib: contains all the library functions in binary form.
• /usr: contains several directories, each associated with a particular user.
• /tmp: contains the temporary files created by Unix or by any user.
• /etc: contains configuration files of the system.
• /home: contains user directories and files.
• /mnt: contains device files related to mounted devices.
• /proc: contains files related to system processes.
• /root: the root user's home directory (note this is different from /).
• /sbin: system binary files reside here. If there is no sbin directory on your system, these files most likely reside in /etc.

A file system consists of files, relationships to other files, as well as the attributes of each file. File attributes are information relating to the file, but do not include the data contained within the file. File attributes for a generic operating system might include (but are not limited to):
 a file type (i.e. what kind of data is in the file)
 a file name (which may or may not include an extension)
 a physical file size
 a file owner
 file protection/privacy capability
 a file time stamp (time and date created/modified)

cp (Copying Files): To create an exact copy of a file you can use the cp command. The format of this command is:
cp [-option] source destination
Example: $cp file1 file2
Here file1 is copied to file2.
Example: $cp file1 file2 dir
Here file1 and file2 are copied into the directory dir.
cp turns interactive when the -i option is used and the destination file already exists:
$cp -i file1 file2
overwrite file2 (yes/no)?
y at this prompt overwrites the file.

mv (Moving and Renaming Files): Used to move or rename files/directories.
$mv [-option] source destination
$mv test sample
Here test is renamed as sample.

ln (link): Used to create links (both soft and hard links). It creates an alias and increases the link count by one.
$ln file1 file2
ln won't work if the destination file already exists.

rm (Deleting Files and Directories): To delete or remove a file, you use the rm command. For example,
$rm my.listing
will delete "my.listing".
With the -i option rm removes files interactively:
$rm -i file1
With the -r option rm recursively removes directories:
$rm -r dir1
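The four utilities above can be sketched in one short session; the file names here are illustrative only:

```shell
# Work in a throwaway directory so nothing real is touched.
cd "$(mktemp -d)"

echo "hello" > file1        # create a small file
cp file1 file2              # copy: file2 is an exact copy of file1
mv file2 sample             # rename file2 to sample
ln file1 hardlink           # hard link: the data's link count rises to 2
ln -s file1 softlink        # soft (symbolic) link pointing at file1
rm sample                   # delete the renamed copy
ls -l                       # shows file1, hardlink (link count 2) and softlink
```

Note that removing `hardlink` would not delete the data; it only drops the link count back to 1.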
mkdir: used to create one or more directories.
$mkdir book
Creates the directory named book.
$mkdir dbs doc dmc
Creates three directories.

rmdir: removes empty directories.
$rmdir book
removes the directory named book if it is empty.
$rmdir dbs doc dmc
removes the three directories.

find: recursively examines a directory tree, looks for files matching some criteria, and then takes some action on the selected files.
Syntax:
find path_list selection_criteria action
• To locate all files named a.out use
$find / -name a.out -print
'/' indicates the search should start from the root directory.
• To locate all C files in the current directory
$find . -name "*.c" -print
• To find all files beginning with an uppercase letter use
$find . -name '[A-Z]*' -print
Find operators: find uses 3 operators: !, -a, -o.

Security by file permissions:
Unix follows a 3-tiered file protection system. Each group in a file's permission string represents a category; there are 3 categories: owner, group and others. Each category contains read, write and execute permissions.
• rwx -> presence of all permissions
• r-x -> absence of write permission
• r-- -> absence of write and execute permissions

chmod (changing file permissions):
chmod sets a file's permissions (read, write and execute) for all three categories of users (owner, group and others).
Syntax:
chmod category operation permission file(s)
The command contains three components:
• the category of user (owner, group or others)
• the operation to be performed (assign or remove a permission)
• the permission type (read, write or execute)
Abbreviations used by chmod for permissions:
r - read permission
w - write permission
x - execute permission

Absolute assignment:
Absolute assignment by chmod is done with the = operator. Unlike the + or - operators, it assigns only those permissions that are specified along with it and removes all other permissions.
If you want to assign only read permission to all three categories and remove all other permissions from the file small, use
chmod g-wx,o-x small
or simply use the = operator in any of the following ways:
chmod ugo=r small
chmod a=r small
chmod =r small

The octal notation:
chmod also takes a numeric argument that describes both the category and the permission. The notation uses octal numbers: each permission is assigned a number, read permission - 4, write permission - 2, execute permission - 1.
For example:
chmod 644 foo.txt
Here 644 covers user, group and others: the user has read and write permission, while the group and others have read-only permission.
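A minimal sketch of both chmod styles on a scratch file (the `stat -c %a` used to display the octal mode is a GNU coreutils option, assumed available here):

```shell
cd "$(mktemp -d)" && touch foo.txt

chmod a=r foo.txt        # absolute assignment: everyone gets read only
stat -c %a foo.txt       # -> 444

chmod u+w foo.txt        # add write permission back for the owner
stat -c %a foo.txt       # -> 644

chmod 755 foo.txt        # octal: rwx (4+2+1) for owner, r-x (4+1) for group/others
stat -c %a foo.txt       # -> 755
```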
Process utilities:
A process is a sequence of instructions, and each process has a block of controlled data associated with it. Processes can be manipulated in a way similar to how files can be manipulated.
A process contains:
 a process ID
 a parent process ID
 the state of the process
 CPU usage

ps (process status) can be used to see/list all the running processes.
• ps with the -f option displays a fuller listing that includes the PPID.
• ps with the -u option followed by a user ID displays the processes owned by that user.
• ps with the -e option displays the system processes.
Options of the ps command:
1. -a : shows all running processes on a terminal, with the exception of group leaders.
2. -u : fetches information about a user and what that user is doing.
3. -c : displays scheduler data.
4. -d : displays all processes, with the exception of session leaders.
5. -f : displays a full listing of processes.
6. -e : displays every process running on the system.
7. -j : displays the group IDs and session IDs of the processes.
8. -l : displays a long listing.

$ps -a
The above command is used to find out which processes are running for the other users who have logged in.
$ps -u user1
to see what a particular user is doing; u stands for user and user1 is the username.
$ps -t
The ps -t command is used to find out the processes that have been launched from a particular terminal.
$ps -f
gives additional information about a process, such as the parent process ID, the start time and the name of the running process. sched is the father of all processes and is launched when the machine is booted.

touch command:
If the user wants to create a file, or create or modify its timestamp, the touch command is used. Following are some important options of the touch command:
Syntax:
$touch [options] Filename
-t : set the timestamp to the specified time
-r : use the access and modification times of another file
-m : change the modification time
-a : change the access time only
-c : if the file does not exist, do not create it
-d : set the access and modification time
$touch File1 File2 ... FileN
touch is used to create a single empty file or multiple empty files.
Example:
$touch Amit Shyam

Changing the file access time:
touch is used to change access and modification times. The user can use the -a option to change the access time:
Syntax:
$touch -a Filename
Example:
$touch -a Shyam
The above command will set the current date and time on the file named Shyam. You can check the file's modified date and time using the ls command:
$ls -ltr
Output:
Shyam 03-may-2017 09:46:45

Changing the modification time:
If the user wants to change only the modification time of a file, leaving the access time as it is, the -m option is used:
Syntax:
$touch -m Filename
Example:
$touch -m Shyam
The above command will change the modification time of the file named Shyam.

Avoiding creation of a new file:
To avoid creating a new file when it does not exist, the user can use the -c option:
Syntax:
$touch -c Filename
Example:
$touch -c Shyam
The above command will not create the file Shyam if it does not already exist.

Explicitly setting the access and modification time:
The user can explicitly set the access and modification time with a specified timestamp. The format is [[CC]YY]MMDDhhmm:
Syntax:
$touch -c -t [[CC]YY]MMDDhhmm Filename
Example:
$touch -c -t 17101010 Shyam
To see the timestamp of the file:
$ls -l
Output:
Shyam Oct 10 10:10

Using the timestamp from another file:
The user can copy the timestamp of another file and assign it to his own file by using the -r option:
Syntax:
$touch -r File1 File2
Example:
$touch -r Amit Shyam
The access/modification times of Amit are assigned to the file named Shyam.
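The timestamp options above can be sketched as follows; `date -r FILE` (GNU coreutils, assumed here) prints a file's modification time, which makes the effect visible:

```shell
cd "$(mktemp -d)"

# Create a file with an explicit timestamp ([[CC]YY]MMDDhhmm format).
touch -t 202301151030 stampdemo
date -r stampdemo "+%Y-%m-%d %H:%M"   # -> 2023-01-15 10:30

# Copy that timestamp onto another file with -r.
touch other
touch -r stampdemo other
date -r other "+%Y-%m-%d %H:%M"       # -> 2023-01-15 10:30

# -c refuses to create a missing file.
touch -c never_made
ls never_made 2>/dev/null || echo "not created"
```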


Fields displayed by ps:
UID: user ID that this process belongs to (the person running it)
PID: process ID
PPID: parent process ID (the ID of the process that started it)
C: CPU utilization of the process
STIME: process start time
TTY: terminal type associated with the process
TIME: CPU time taken by the process
CMD: the command that started this process

kill:
$kill PID
When invoked, the kill command sends a termination signal to the process being killed. We can employ the sure-kill signal to forcibly terminate a process; the signal number for sure kill is 9.
Syntax:
$kill [option] PID
Example:
$kill -9 2398

A process can be run in two ways:
1. Foreground process: every process, when started, runs in the foreground by default; it receives input from the keyboard and sends output to the screen.
$pwd
Output:
/home/geeksforgeeks/root
When a command/process running in the foreground takes a lot of time, no other process can be run or started, because the prompt is not available until the program finishes processing and exits.
2. Background process: runs in the background without keyboard input, and waits until keyboard input is required. Other processes can therefore run in parallel with the process running in the background, since they do not have to wait for the previous process to complete. Adding & after the command starts it as a background process:
$pwd &
Since pwd does not want any input from the keyboard, it goes to the stopped state until moved to the foreground or given any data input. Thus, on pressing Enter:
Output:
[1] + Done pwd
$
The first line contains information about the background process: the job number and the process ID. It tells you that the background process finished successfully. The second line is a prompt for another command.

nice command:
The nice command is used to run a program with a modified scheduling priority, by assigning a niceness to the process. The syntax is
$nice [OPTION] [COMMAND [ARG]...]
This runs COMMAND with an adjusted niceness, which affects process scheduling. With no COMMAND, it prints the current niceness. Niceness values range from -20 (most favorable scheduling) to 19 (least favorable).
Options:
-n, --adjustment=N : add integer N to the niceness (default 10)
--help : display help and exit
--version : output version information and exit
pradeep@varisha:~$ nice
0
This means that currently no extra priority is assigned to processes.

Disk utilities:
df: displays the amount of free space available on the disk. The output is displayed for each file system separately.
$df [OPTION]... [FILE]...
du (disk usage): summarizes the disk usage of each FILE, recursively for directories.
Syntax:
$du [OPTION]... [FILE]...
$du [OPTION]... --files0-from=F

mount: used to mount file systems. It takes two arguments: the device name and the mount point.
• mount uses the -t option to specify the type of the file system.
• To mount a CD-ROM or a floppy file system on a Linux system use
$mount -t iso9660 /dev/cdrom /mnt/cdrom
$mount -t msdos /dev/fd0 /floppy

For information on a single process, ps is used along with the process ID:
$ps 19
PID TTY TIME CMD
19 pts/1 00:00:00 sh
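The background-process and kill ideas above can be combined in a short sketch; `$!` is the shell's variable holding the PID of the most recent background job:

```shell
# Start a long-running command in the background with &.
sleep 60 &
pid=$!                       # PID of the background sleep

kill -0 "$pid" && echo "process $pid is alive"   # signal 0 only tests existence

kill -9 "$pid"               # send the sure-kill signal (SIGKILL)
wait "$pid" 2>/dev/null      # reap it; exit status 137 = 128 + signal 9
echo "exit status: $?"       # prints "exit status: 137"
```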


pidof: for a running program (a named process), pidof finds the process IDs (PIDs).

umount: unmounting file systems.
• Unmounting is achieved with the umount command, which requires either the file system name or the mount point as its argument.
$umount /oracle
$umount /dev/hda3
• Unmounting a file system is not possible if any of its files are open.

ulimit (user limit):
• It contains a value which signifies the largest file that can be created by the user in the file system.
• When used by itself, it displays the current setting:
$ulimit
unlimited
• The user can also set the ulimit value:
$ulimit 10

umask:
When you create files and directories, the default permissions that are assigned to them depend on the system's default setting. This default is transformed by subtracting the user mask from it to remove one or more permissions. The current mask is displayed by running umask without arguments:
$umask
022

Basic networking commands:
 ifconfig
 ping
 netstat
 mailto
 nslookup
 dnsdomainname
 route
 traceroute
 hostname
 rlogin
 rdate
 telnet
 ftp

ping:
The ping command sends echo request packets to a network host to see if it is accessible on the network. The syntax is
$ping [-R] [-c number] [-d] [-i seconds] host
We can use two flags to learn about packet transfer in the network:
-c number : stops sending packets after the specified number of packets have been sent.
-d : outputs packets as fast as they come back, or 100 times per second, whichever is greater.
The -d option can only be used by the root user, because it can generate extremely high volumes of network traffic.
For example:
$ping 152.106.50.27

netstat:
The netstat command displays network status information, including connections, routing tables, and interface statistics. When no options are provided, a list of active sockets is displayed. The syntax is
$netstat [-Mnrs] [-c] [-i interface] [--interface interface] [--masquerade] [--route] [--statistics]
Options:
-c : displays the selected information every second until Ctrl-C interrupts it.
-i [interface] / --interface [interface] : displays information about the specified interface, or all interfaces if none is specified.
-M / --masquerade : displays a list of masqueraded sessions.
-n : shows numerical addresses instead of host, port, or user names.
For example:
pradeep@pradeep:~$ netstat -v
unix 3 [] STREAM CONNECTED 8493

mailto:
The mailto command sends an e-mail to one or more recipients. If no recipients are indicated on the command line, the user is prompted for the recipients. If no standard input is provided, the user is prompted for the content of the message. The syntax is
$mailto [-a character-set] [-c address,] [-s subject] [recipient ...]

ifconfig:
The ifconfig command configures a network interface, or displays its status if no options are provided. If no arguments are provided, the current state of all interfaces is displayed. The syntax is
$ifconfig interface options address
where interface specifies the name of the network interface (e.g. eth0 or eth1). To get the IP address of the host machine, we enter $ifconfig at the command prompt.
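The umask subtraction described above can be sketched directly; `stat -c %a` (GNU coreutils, assumed available) prints the resulting octal mode:

```shell
cd "$(mktemp -d)"

umask 022                 # new regular files get 666 - 022 = 644
touch mask_a
stat -c %a mask_a         # -> 644

umask 077                 # stricter mask: owner-only access
touch mask_b
stat -c %a mask_b         # -> 600
```

Note that umask only affects files created after it is set; it never changes the permissions of existing files.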


ifconfig example:
$ifconfig
This provides us with the network configuration of the host machine: the Ethernet adapter, IP address, subnet mask and default gateway.

Flags of mailto:
-a character-set : specifies an alternate character set, such as ISO-8859-8. The default is US-ASCII.
-c address : specifies carbon-copy addresses.
-s subject : specifies the subject line of the message. If the subject is more than one word, enclose it in quotation marks.
To finish composing a message, use Ctrl-D or type a . alone on a blank line.

nslookup:
The nslookup command queries Internet domain name (DNS) servers. It has two modes: interactive and non-interactive. If no host name is provided, the program enters interactive mode, which allows the user to query name servers for information about various hosts and domains, or to print a list of hosts in a domain. Non-interactive mode is used to print just the name and requested information for a host or domain. By default, the DNS server specified in /etc/resolv.conf is used unless another is specified. If we want to specify a server but not look up a specified host, we must provide a - in place of the host.
The syntax is
$nslookup [-option] [name | -] [server]

dnsdomainname:
The dnsdomainname command displays the system's DNS domain name, based on its fully qualified domain name. The syntax is
$dnsdomainname
For example:
$dnsdomainname
jntua.ac.in

traceroute:
The traceroute command displays the route a packet travels to reach a remote host on the network. The syntax is
$traceroute [-r] host
:~$ traceroute google.com
traceroute to google.com (173.194.36.40), 64 hops max
10.4.16.1 (10.4.16.1) 0.609ms 0.625ms 0.654ms

hostname:
The hostname command displays or sets the system's host name. If no flags or arguments are given, the host name of the system is displayed. The syntax is
$hostname [-a] [--alias] [-d] [--domain] [-f] [--fqdn] [-i] [--ip-address] [--long] [-s] [--short] [-y] [--yp] [--nis]
For example:
deepu@deepu:~$ hostname
raaz

rdate:
The rdate command retrieves the current time from one or more hosts on the network and displays the returned time. The syntax is
$rdate [-p] [-s] host ...
-p : displays the time returned from the remote system (the default behavior).
-s : sets the local system's time based on the time retrieved from the network. This can only be used by the root user.
deepu@deepu:~$ rdate
Usage: rdate [-46acnpsv] [-o port] host
-4 : use IPv4 only
-6 : use IPv6 only

telnet (remote login):
If you have an account on a host in the local network (or on the Internet), you can use telnet with the host name or the IP address as argument.
$telnet saturn
Trying 192.168.0.1...
Connected to saturn
Login: ----
Password: ----
You can quit telnet by using the exit command.
The telnet prompt: when telnet is used without a host name or IP address, the system displays a telnet> prompt. You can invoke a login session from there with open:
telnet> open 192.168.0.8
Trying 192.168.0.8...
Connected to 192.168.0.8

route:
The route command displays or alters the IP routing table. When no options are provided, the routing table is displayed. The syntax is
$route add [-net|-host] targetaddress [netmask Nm] [gw Gw] [[dev] If]
$route del [-net|-host] targetaddress [gw Gw] [netmask Nm] [[dev] If]
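A small local sketch of the hostname command; `uname -n` reports the same node name, so the two should agree on any one machine:

```shell
hostname          # prints the system's host name
uname -n          # the kernel's node name, same value
[ "$(hostname)" = "$(uname -n)" ] && echo "names agree"
```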
ftp (file transfer protocol):
• ftp is used to transfer files. It can be used with a host name.
$ftp saturn
Connected to saturn
• ftp>binary
200 type set to I
• ftp>put penguin.gif
• To copy multiple files use mput:
ftp>mput t*.sql
• Downloading files: get & mget. To download files from the remote machine use get and mget:
ftp>get ls-lR.gz

Filters:
Filters are building-block programs that read from the standard input stream and write to the standard output stream. They can be used in pipelines, often in shell scripts. Some filters can receive data directly from a file.
Typical usage:
prog file1 file2 | filter1 | filter2
cat file1 file2 | filter1 | filter2
filter1 <file1 | filter2
We can often redirect the output of the final filter to a file. Some filters can take file arguments:
$sort file1 file2 | filter1 | filter2
Here we list some of the filters and their actions:
Filter   Action
more     Passes all data from input to output, with pauses at the end of each screen of data.
cat      Passes all data from input to output.
cmp      Compares two files.
comm     Identifies common lines in two files.
cut      Passes only specified columns.
diff     Identifies differences between two files, or between common files in two directories.
head     Passes the specified number of lines at the beginning of the data.
paste    Combines columns.
sort     Arranges the data in sequence.
tail     Passes the specified number of lines at the end of the data.
tr       Translates one or more characters as specified.
uniq     Deletes duplicate lines.
wc       Counts characters, words, or lines.
grep     Passes only specified lines.
sed      Passes edited lines.
awk      Passes edited lines - parses lines.

tee:
tee is an unusual filter because it doesn't change lines from stdin at all; it is used to keep a record part-way through a pipeline. It copies the contents of stdin to stdout, but it also creates a copy in a file.
Syntax:
$tee [options] file ...
Options:
-a, --append : append to the given FILEs, do not overwrite
-i, --ignore-interrupts : ignore interrupt signals
--help : display this help and exit
--version : output version information and exit
Example:
$prog file1 file2 | filter1 | \
tee spy | filter2
This copies its stdin to both stdout and the file spy. Note: the special operator \ is used for continuation onto the next line in the shell, as in the following example:
$prog file1 file2 | filter1 | \
tee -a record | filter2
This appends to the file record. If a FILE is -, tee copies to standard output again.
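The record-keeping role of tee in a pipeline can be sketched with ordinary commands in a scratch directory:

```shell
cd "$(mktemp -d)"

# tee copies its stdin to stdout AND records a copy in a file,
# so the pipeline keeps flowing while "spy" captures the traffic.
printf 'one\ntwo\nthree\n' | tee spy | wc -l    # -> 3
cat spy                                          # the same three lines were recorded

printf 'four\n' | tee -a spy > /dev/null         # -a appends instead of overwriting
wc -l < spy                                      # -> 4
```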


Unit -1
Linux Programming
tr [a-z] [A-Z] grep '^function' file1.c file2.c
Unfortunately [ ] causes wildcard treatment for a filename to It displays file name and lines with function at
the shells, which is not intended here.We must quote the beginning of line.
arguments; but the quotes are not passed by the shell. 1) To find processes run by user fred in BSD and
$tr '[a-z]' '[A-Z]' SystemV respectively:
It translates lower-case letters to upper case and shell is ps aux | grep fred | grep -v grep
passed tr [a-z] [A-Z]. so, it is safe to use single quotes. We ps -ef | grep fred | grep -v grep
can use *, meaning times number required, if translating to 2) To edit C source files containing Fred:
the same character, 0 in the following: vi `grep -l fred *.c`
$ tr '[a-z]' '[0*]' There are also fgrep (fast grep with no regular expressions)
We can delete characters by making use of option -d , all and egrep (extended regular expressions).
alphabetic characters will be deleted by Head:
$tr -d '[a-zA-Z]' Head is filter, which is used to extract only the first lines from
The complement can be specified by -c and multiple a pipeline or from files, often when prototyping.
replacements can be squashed into a single replacement with
the -s option, and control characters can be specified as octal
numbers of the form \012, as follows:
$ tr '[A-Z]' '[a-z]' | tr -cs '[a-z]' '[\012*]'
This can be used to get all words in lower case, each on a
separate line (i.e. separated by a newline, the \012 character),
while deleting every other character.
head:
head displays the beginning of a file.
Syntax: $ head -n input-file
It reads the first 10 lines by default if we do not specify a
value for n; it displays the first n lines of the input file if we
do.
pari@dishen:~$ head -10 sum.pl
The above command displays the first 10 lines of the file.
tail:
The tail command displays lines through to the end of a file.
It is used to look at the end of a file and display the last n
lines of the specified file.
Syntax: $ tail [options] [FILE]...
$ tail -n input-file
It reads the last 10 lines by default if we do not specify a
value for n; it displays the last n lines of the input file if we
do.
Eg1. tail also works as a filter. For example, to start from
line 2000 and use head to stop after the next 1000 lines:
$ prog file | tail +2000 | head -1000 | filter
Eg2. To output the last 2 lines of file1 in reverse order:
$ tail -2r file1
grep:
It stands for "get regular expression pattern". grep searches
for regular expression patterns in lines and prints the lines
containing that pattern. As a filter it selects the lines from
stdin that contain the search pattern and copies them to
stdout; other lines are discarded.
Syntax:
$ grep [options] pattern filename
It returns 0 for success, 1 for failure (no match), and 2 for a
bad pattern.
Options Description
-i ignore case
-l list file names only
-c count of matching lines only
-n number lines
-v output lines that don't match the pattern
-e lets the pattern start with -, and allows multiple patterns
1) grep found // displays lines containing the pattern
"found" (reads stdin when no file is given)
2) grep '[Pp]hrase of interest'
It displays lines containing "phrase of interest" or "Phrase of
interest". Here, quotes are needed so that the shell passes the
spaces and the [] on to grep as a single argument.
3) As a program reading its file arguments, grep is used to
find files and/or show lines containing the search pattern.
Sort:
Sort can be used for sorting the contents of a file.
$ sort shortlist
Sorting starts with the first character of each line and
proceeds to the next character only when the characters in
two lines are identical.
Sort options:
With the -t option, sort sorts a file based on fields.

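The three filters above compose naturally in a pipeline. A minimal runnable sketch (the file name and its contents are invented for illustration):

```shell
# Sample data (hypothetical file).
printf 'alpha\nbeta\ngamma\ndelta\nepsilon\n' > letters.txt

# First 3 lines.
head -n 3 letters.txt

# Last 2 lines.
tail -n 2 letters.txt

# head and tail compose as filters: lines 2-4 of the file.
head -n 4 letters.txt | tail -n 3

# grep as a filter, ignoring case with -i.
head -n 4 letters.txt | grep -i 'GAMMA'

# -c prints a count of matching lines, not of occurrences.
grep -c 'a' letters.txt
```

Note that `head -n 4 | tail -n 3` is the portable way to cut a middle slice of a stream.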
Prepared by G. Pradeep Reddy, Lecturer, Dept. C.S.E, JNTUACEA, Anantapur Page 9


Unit -1
Linux Programming
$ sort -t "|" +2 shortlist
The sort order can be reversed with the -r option.
Sorting on a secondary key:
You can sort on more than one field, i.e. you can provide a
secondary key to sort. If the primary key is the third field
and the secondary key the second field, we can use
$ sort -t \| +2 -3 +1 shortlist
• Numeric sort (-n):
To sort on a number field, use sort with the -n option.
$ sort -t: +2 -3 -n group1
• Removing duplicate lines (-u):
The -u option lets you purge duplicate lines from a file.
Fields and field specifiers:
In general, a field is the smallest unit of data that has
meaning in describing information. For example, a student
file contains name, address, major and other fields that
contain data about a student. Sort defines a field as a set of
characters delimited by a single blank or a tab. The first
field of a line is delimited by the space or tab after it.
When a field sort is required, we need to define which field
or fields are to be used for the sort. Field specifiers are a set
of two numbers that together identify the first and last field
in a sort key. They have the following format:
+number1 -number2
Number1 specifies the number of fields to be skipped to get
to the beginning of the sort field, whereas number2 specifies
the number of fields to be skipped, relative to the beginning
of the line, to get to the end of the sort key.
+0-1 : the first field will be selected.
+0-2 : the first two fields will be selected.
+2-5 : the third to fifth fields will be selected.
+3 : the fourth field to the end of the line will be selected.
Keys:
A sort key defines a set of fields. By default there is one
sort key: the entire line. Sort keys are applied in the order
they are defined. Keys are defined using
-k startPos,endPos
Eg:
$ sort -k 2,3 -k 5,7 -k 1,1
It would sort on fields 2 and 3, then fields 5, 6 and 7, then
field 1, and finally on the rest of the fields in an input line.
uniq:
This command can be used to delete duplicate lines in a file,
keeping the first and deleting the others. To be deleted, the
lines must be adjacent. Duplicate lines that are not adjacent
are not deleted; to delete nonadjacent duplicates, the file
must be sorted first.
Syntax:
$ uniq [options] [INPUT [OUTPUT]]
If there is no option, one copy of each unique line is printed
on to the standard output and duplicated lines are discarded.
That is, uniq discards all but one of successive identical
lines from INPUT (or standard input), writing to OUTPUT
(or standard output).

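The old `+pos` notation shown above is obsolete; with a modern sort the same sorts are usually written with `-k`. A small runnable sketch (the file name and data are made up):

```shell
# name|dept|salary records (made-up data).
printf 'rama|sales|300\nlata|admin|120\nrama|sales|300\nanil|sales|250\n' > shortlist

# Sort on field 2, then numerically in reverse on field 3
# (-k startPos,endPos replaces the obsolete +pos form).
sort -t '|' -k 2,2 -k 3,3nr shortlist

# uniq removes only ADJACENT duplicates, so sort first.
sort shortlist | uniq

# -c prefixes each surviving line with its occurrence count.
sort shortlist | uniq -c
```

The `sort | uniq -c` idiom is the usual way to count how often each distinct line occurs.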
more - file perusal filter for crt viewing

More is a filter for paging through text one screenful at a
time. This version is especially primitive. Users should
realize that less(1) provides more(1) emulation and
extensive enhancements.
Syntax:
more [-dlfpcsu] [-num] [+/pattern] [+linenum] [file ...]
Text processing utilities: cat, tail, head, sort, nl, uniq, grep,
cut, paste, join, comm, pg, cmp, diff, tr.
cat:
cat is used to create files.
• $ cat > filename
Type some text here
Press ctrl+d at the $ prompt
• cat can also be used to display the contents
of a file:
$ cat filename
• cat can also concatenate the contents of 2 files and
store them in a third file:
$ cat file1 file2 > newfile
• To append the contents of two files to another file,
use
$ cat file1 file2 >> newfile
nl:
• nl is used for numbering the lines of a file.
• nl numbers only logical lines - those containing
something apart from the newline character.
$ nl file
• nl uses a tab as the default delimiter, but we can
change it with the -s option.
$ nl -s: file
• nl won't number a line if it contains nothing.
cut: splitting a file vertically
You can slice a file vertically with the cut command.
• Cutting columns (-c):
cut with the -c option cuts columns.
To extract the first 4 columns of the file group1:
$ cut -c 1-4 group1
The specification -c 1-4 cuts columns 1 to 4.
• Cutting fields (-f):
To cut the 1st and 3rd fields, use
$ cut -d: -f1,3 group1
paste: pasting files
• What you cut with cut can be pasted back with the
paste command - but vertically rather than
horizontally. You can view two files side by side by
pasting them.
• To join the two files calc.lst and result.lst, use
$ paste -d= calc.lst result.lst
join:
• join is a command in Unix-like operating systems
that merges the lines of two sorted text files based on
the presence of a common field.
• The join command takes as input two text files and a
number of options. If no command-line option is
given, it looks for a pair of lines from the two files
having the same first field (a sequence of characters
that are different from space), and outputs a line
composed of the first field followed by the rest of
the two lines.
$ join file1 file2
diff command:
This command is used to display file differences. It also
tells you which lines in one file have to be changed to make
the two files identical.
Syntax: $ diff file1 file2
diff uses certain special symbols and instructions to
indicate the changes that are required to make two files
identical. Each instruction uses an address combined with
an action that is applied to the first file.
1. 7a8 means append a line after line 7, which
becomes line 8 in the second file.
2. 3c3 means change line 3, which remains line 3
after the change.
3. 5,7c5,7 changes 3 lines.
pg command:
pg is a terminal pager program on Unix for viewing text
files. It can also be used to page through the output of a
command via a pipe. pg uses an interface similar to vi.
Syntax: $ pg filename
Backup utilities:
tar: the tape archive program
• tar doesn't normally write to the standard output but
creates an archive in the media.
• tar accepts file and directory names as arguments.
• It operates recursively.
• It can create several versions of the same file in a
single archive.
• It can append to an archive without overwriting the
entire archive.
• The -c option is used to copy files to the backup
device:
$ tar -cvf /dev/rdsk/foq18dt /home/sales/sql/*.sql
• The verbose option (-v) shows the number of blocks
used by each file.
• Files are restored with the -x (extract) key option;
when no file or directory name is specified, it
restores all files from the backup device.
$ tar -cvf archive.tar f1 f2 f3 // creates an archive
combining these 3 files
$ tar -xvf archive.tar // extracts all files from the
archive
$ tar -tvf archive.tar // lists the properties of all
files in the archive

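The tar create/list/extract cycle above can be exercised end to end. A runnable sketch (file and directory names are invented):

```shell
# Create two small files to archive (hypothetical names).
mkdir -p demo
printf 'one\n' > demo/f1
printf 'two\n' > demo/f2

# -c creates an archive, -f names it (add -v for a verbose listing).
tar -cf demo.tar demo/f1 demo/f2

# -t lists the archive members without extracting them.
tar -tf demo.tar

# -x extracts; -C restores into a different directory here,
# to show the round trip without touching the originals.
mkdir -p restore
tar -xf demo.tar -C restore
cat restore/demo/f1
```

Extracting with `-C` into a scratch directory is a common way to verify a backup without overwriting live files.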
cpio: copy input-output
• The cpio command copies files to and from a
backup device. It uses standard input to take the list
of filenames.
• It then copies them, with their contents and headers,
into a stream which can be redirected to a file or a
device.
• cpio can be used with redirection and piping.
• cpio uses two options, -o (output) and -i (input),
either of which must be there in the command line.
The fgrep and egrep commands:
The fgrep and egrep commands are advanced pattern
matching commands. The fgrep command doesn't use any
meta character for its searched pattern. The primary
advantage of fgrep is that it can also search for two or more
strings simultaneously. The fgrep command can be used
like this:
$ fgrep 'good
bad
great' userfile
Here a single quote is used to mark the three strings as one
argument. Here we are going to search for three different
strings: good, bad, and great. The egrep command can be
used to search for these in a more compact form than fgrep:
$ egrep 'good | bad | great' userfile
egrep uses an or ( | ) operator to achieve this. Therefore the
egrep command is more compact and more versatile than
fgrep. Another achievement of egrep is that we can make
groups of different patterns when we use | as an operator:
$ egrep 'sunil | rohan gavasker' players
Here sunil is the first pattern and everything to the right is
considered as the second pattern. If we want to search for
both sunil gavasker and rohan gavasker, use grouping:
$ egrep '(sunil | rohan) gavasker' players
Both commands accept the -f option for reading the patterns
from a file instead of specifying them directly:
$ egrep -f pat.lst file1
$ fgrep -f pat.lst file1
awk command:
This command made a late entry into the UNIX system in
1977 to augment the tool kit with suitable report formatting
capabilities. The awk name comes from its authors Aho,
Weinberger and Kernighan.
Syntax:
awk options 'selection criteria {action}' file(s)
Examples:
$ awk -F" " '$3 > 100 { print }' file1 (or)
$ awk -F" " '$3 > 100' file1 (or)
$ awk -F" " '$3 > 100 { print $0 }' file1 // displays the
lines in file1 whose 3rd field value is greater than 100
$ awk -F" " '$3 > 100 { print $1,$3 }' file1 // displays the
1st and 3rd fields of the lines whose 3rd field value is
greater than 100
$ awk -F" " '/mca/ { print }' file1 // displays the lines that
contain the data 'mca'
$ awk -F" " 'NR==3,NR==6 { print NR,$2,$3 }' file1
// displays the line number, 2nd and 3rd fields of lines 3 to 6
printf: for displaying formatted output
$ awk -F" " 'NR==3 {
> printf "%3d %20s \n",NR,$1 }' file1 // displays the line
number and the 1st field value
$ awk -F" " 'NR==3 {
> printf "%3d %20s \n",NR,$1 }' file1 > file2 // the output
is stored in file2
Comparison operators: <, <=, ==, !=, >=, >, ~ (matches a
regular expression), !~ (doesn't match a regular expression)
$ awk -F" " '$3=="director" || $3=="chairman" { print }' file1
Number processing: +, -, *, / and %
$ awk -F" " '$4=="sales" {
> printf "%20s %10d %8.2f \n",$2,$3,$3/11 }' file1
Variables:
$ awk -F" " '$3>100 {
> count = count + 1
> printf "%d \n",count }' file1
awk supports count++, count += 2 and ++count.
Reading the program from a file:
$ cat > sample.awk
$2==100 {print $1}
Press ctrl + d
$ awk -F" " -f sample.awk file1
Text Editors:
vi Editor:
• vi is a full screen text editor. It was created by Bill
Joy.
• Bram Moolenaar improved it and called it vim (vi
improved).
• Invoking vi:
$ vi filename
• vi has 3 modes of operation.
1. Command mode: In this mode all the keys
pressed by the user are interpreted as commands. It
may perform some actions like move the cursor,
save, delete text, quit vi, etc.
2. Input/Insert mode: used for inserting text;
start by typing i, finish with ESC.

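The awk forms above can be tried directly. A small runnable sketch (the file names and data are invented):

```shell
# name dept salary sample (made-up data, whitespace-separated).
printf 'rama sales 300\nlata admin 120\nanil sales 250\n' > file1

# Pattern with the default action: print whole matching lines.
awk '$3 > 200' file1

# Pattern with an explicit action: print selected fields.
awk '$3 > 200 { print $1, $3 }' file1

# The same program stored in a file and read with -f.
printf '$3 > 200 { print $1 }\n' > sample.awk
awk -f sample.awk file1
```

Keeping the program in a file with `-f` avoids shell-quoting problems as the program grows.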
3. Ex mode or last line mode: used for giving
commands at the command line. The bottom line of
vi is called the command line.
Basic Cursor Movements:
h move cursor one place to the left
j down one line
k up one line
l right one place
w move forward one word
b back one word
Finishing a vi Session:
• Get to command mode (press ESC)
ZZ save changes to the file and quit
(no RETURN)
:q! quit without saving
(press RETURN)
:wq! save the file and quit
Inserting Text:
• Move to the insertion point
• Switch to input mode: i
• Start typing; BACKSPACE or DELETE
for deletion
• ESC to finish; back in command mode
Deletion:
• Must be in command mode.
x delete the character the cursor is on
dd delete the current line
D delete from the cursor position to the
end of the line
u undo the last command
Sed:
sed is a stream editor. A stream editor is used to perform
basic text transformations on an input stream (a file or input
from a pipeline). While in some ways similar to an editor
which permits scripted edits (such as ed), sed works by
making only one pass over the input(s), and is consequently
more efficient. But it is sed's ability to filter text in a
pipeline which particularly distinguishes it from other types
of editors.
sed is called the stream editor since it works from stdin
or from input files, and writes its output to stdout. That is, it
works on a stream of data through stdin/stdout.
Therefore:
sed - stream editor for filtering and transforming text
sed [OPTION]... [{sed script}] [input-file]...
The options need not be used. Some of the common ones
include:
-n, --quiet, --silent
suppress automatic printing of pattern space
-e script, --expression=script
add the script to the commands to be executed
-f script-file, --file=script-file
add the contents of script-file to the commands
to be executed
-r, --regexp-extended
use extended regular expressions in the script
One of the most common uses for sed is to substitute one
string for another, using a regular expression:
stream -> | sed 's/bad/good/g' | stream ->
This will substitute good for bad in every occurrence (g is
for global) in the data stream.
The instruction to sed is s, for substitute. The / is used as
a separator, and is just the character that follows the
instruction - it can be any single character (except newline)
that cannot be found in the instruction. Both / and + are
common; these all do the same thing:
stream -> | sed 's+bad+good+g' | stream ->
stream -> | sed 'sXbadXgoodXg' | stream ->
stream -> | sed 's@bad@good@g' | stream ->
The first item /bad/ is the regular expression to be searched
for. The second /good/ is the substitute value. And the
trailing g causes every /bad/ to be replaced instead of only
the first on each line.
You can combine several operations into a single sed:
... | sed 's/bad/good/g; s+red+green+' | ...
This will substitute good for bad in every occurrence (g is
for global) in the data stream and replace the first red per
line with green. Note well the semicolon.
You can also do the same with -e:
sed -e 's/bad/good/g' -e 's+red+green+'
A collection of sed commands is known as a sed script, and
is not the same as a bash script:
[Prompt]$ ls -l sedscr
-rw-rw-r--. 1 allisor allisor 26 (date) sedscr
[Prompt]$ cat sedscr
s/bad/good/g
s+red+green+
[Prompt]$ echo bad red bad red | sed -f ./sedscr
good green good red
sed command format:
Commands for sed are in one of three forms:
1. with an optional multi-line address:
[address]command
2. with an optional single line address:
[line address]command
3. as a group with a required address:
address {
command1
command2
command3
...
}
The third form is usually only found in sed scripts.
Addresses:
You can address a single line with a line number. For
example, to delete the first line of a file:
[Prompt]$ cat file1
line1
line2
[Prompt]$ sed '1d' file1
line2
[Prompt]$ sed 'd' file1 # note this!
[Prompt]$
The delete command is d, and the first example uses the
address 1 with it to delete the first line. Note especially the
second example, where the default is all lines!
A range of lines uses a comma (file2 has 3 lines):
[Prompt]$ sed '1,2d' file2
line3
You can also address lines with a regex, which must be
enclosed in forward slashes /.../:
[Prompt]$ cat file2
first line
second line
line3
[Prompt]$ sed '/^line/d' file2
first line
second line
You can also combine line numbers and a regex, as in
'1,/^$/d', where all lines from the start to the first empty
line will be deleted.
The last line of the file can be represented by $, so that 2,$
would delete all but the first line:
[Prompt]$ sed '2,$d' file2
first line
Substitute command:
[address]s!regex!replacement!flags
You've already seen the global flag, g. The others are n, a
number, requesting that the nth instance of the regex be
replaced (default 1), p to print (for example, if using -n),
and w file to write to the named file in addition to stdout
(unless -n).
Regular Expressions:
Sed uses regular expressions to match patterns in the input
text, and then perform operations on those patterns.
^ matches the beginning of the line
$ matches the end of the line
. matches any single character
\ escapes any metacharacter that
follows, including itself
(character)* match arbitrarily many occurrences
of (character)
(character)? match 0 or 1 instance of (character)
(character)+ match 1 or more instances of
(character)
[abcdef] match any character enclosed in [ ]
(in this instance, a b c d e or f)
[^abcdef] match any character NOT enclosed
in [ ]
(character)\{m,\} match m or more repetitions of
(character)
(character)\{,n\} match n or less (possibly 0)
repetitions of (character)
(character)\{n\} match exactly n repetitions of
(character)
\{n,m\} range of occurrences, n and m are
integers
\(expression\) group operator
expression1|expression2 matches expression1 or
expression2
() groups regular expressions
Examples:
/^M.*/ : line begins with capital M, 0 or more chars follow
/..*/ : at least 1 character long (/.+/ means the same thing)
/^$/ : the empty line
ab|cd : either 'ab' or 'cd'
a(b*|c*)d : matches any string beginning with the letter a,
followed by either zero or more of the letter b, or zero or
more of the letter c, followed by the letter d.
Regular Expressions (character classes):
The following character classes are short-hand for matching
special characters.
[:alnum:] alphanumeric characters
[:alpha:] alphabetic characters
[:blank:] space and tab characters
[:cntrl:] control characters
[:digit:] numeric characters
[:graph:] printable and visible (non-space)
characters
[:lower:] lowercase characters
[:print:] printable characters (includes
white space)
[:punct:] punctuation characters
[:space:] whitespace characters
[:upper:] uppercase characters
[:xdigit:] hexadecimal digits
[[:space:][:alnum:]] : matches any character that is either a
white space character or alphanumeric.
Line Addresses:
Each line read is counted, and one can use this information
to absolutely select which lines commands should be
applied to.
1 first line
2 second line
...
$ last line
i,j from the i-th to the j-th line, inclusive; j can be $
Examples:
sed '53!d'
deletes every line except line 53, i.e. prints only line 53
sed -n '4,9p'
prints only lines 4 through 9

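The substitutions and line addresses above can be tried directly on small streams. A minimal runnable sketch:

```shell
# g replaces every match on each line.
echo 'bad red bad red' | sed 's/bad/good/g'

# Any delimiter can follow s; '+' avoids escaping slashes in paths.
echo '/usr/bin' | sed 's+/usr+/opt+'

# Two commands in one invocation; the second has no g flag,
# so it replaces only the first match per line.
echo 'bad red bad red' | sed 's/bad/good/g; s/red/green/'

# Line addressing: delete line 1, then delete lines 2 through $.
printf 'line1\nline2\nline3\n' | sed '1d'
printf 'line1\nline2\nline3\n' | sed '2,$d'
```

Note how the two-command example reproduces the `good green good red` result shown in the notes.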
(character)\{m,n\} match m-n repetitions of (character)
Examples:
sed '/^$/d' will delete all empty lines
sed '/./,$!d' will delete all leading blank lines
at the top of the file
Some Rules:
• commands may take 0, 1 or 2 addresses
• if no address is given, a command is applied to all
pattern spaces
• if 1 address is given, then it is applied to all pattern
spaces that match that address
• if 2 addresses are given, then it is applied to all
pattern spaces between the pattern space that
matched the first address and the next pattern space
matched by the second address.
• If pattern spaces are single lines, this can be said
like: if 2 addresses are given, then the command
will be executed on all lines between the first
address and the second (inclusive)
Commands:
We will go over only some basic sed commands.
No address:
# comment to the end of the line, quote, or -e
With an address:
a append
c change lines
d delete lines
i insert
p print lines
s substitute
r filename append text read from filename
Both append and insert require that you escape all
embedded newline characters.
Selecting lines by text matching:
The following command prints the lines in /etc/passwd
which end with 'bash':
sed -n '/bash$/p' /etc/passwd
The empty regular expression '//' repeats the last regular
expression match (the same holds if the empty regular
expression is passed to the s command). Note that modifiers
to regular expressions are evaluated when the regular
expression is compiled, thus it is invalid to specify them
together with the empty regular expression.
In the following example, automatic printing is disabled
with -n. The s/2/X/ command changes lines containing '2'
to 'X'. The command /[0-9]/p matches lines with digits and
prints them. Because the second line is changed before the
/[0-9]/ regex is tried, it will not match and will not be
printed:
$ seq 3 | sed -n 's/2/X/ ; /[0-9]/p'
1
3
Context Addresses:
The second kind of addresses are context, or Regular
Expression, addresses. Commands will be executed on all
pattern spaces matched by that RE.
Range Addresses:
An address range can be specified with two addresses
separated by a comma (,). An address range matches lines
starting from where the first address matches, and continues
until the second address matches (inclusively):
$ seq 10 | sed -n '4,6p'
4
5
6
If the second address is a regexp, then checking for the
ending match will start with the line following the line
which matched the first address: a range will always span at
least two lines (except of course if the input stream ends).
$ seq 10 | sed -n '4,/[0-9]/p'
4
5
If the second address is a number less than (or equal to) the
line matching the first address, then only the one line is
matched:
$ seq 10 | sed -n '4,1p'
4
Delete Lines:
1. Delete lines that contain both BEGIN and END.
$ sed '/BEGIN.*END/d' input.dat
2. Delete lines that contain BEGIN but not END.
$ sed -f beginOnly.sed input.dat
#beginOnly.sed
#Delete lines that contain BEGIN but not END
/BEGIN/ {
/END/!d
}
3. Delete a block that starts with a line containing
BEGIN and ends with a line containing END.
$ sed '/BEGIN/,/END/d' input.dat
Delete Text:
1. Delete the text string in one line that starts with
BEGIN and ends with END (inclusive).
$ sed 's/BEGIN.*END//' input.dat
2. Delete the text between two words, BEGIN and
END. The beginning and ending text can be on one
line or can span many lines.

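The range-address and BEGIN/END examples above run as shown; a runnable sketch (assuming GNU sed for the grouped command in the last line):

```shell
# -n suppresses automatic printing; p prints only selected lines.
seq 10 | sed -n '4,6p'

# With a regexp as the second address, matching starts on the
# line AFTER the first address, so this range is lines 4-5.
seq 10 | sed -n '4,/[0-9]/p'

# Delete a whole BEGIN...END block (marker lines included).
printf 'a\nBEGIN\nx\nEND\nb\n' | sed '/BEGIN/,/END/d'

# Delete only lines that contain BEGIN but not END
# (the beginOnly.sed logic, written inline).
printf 'BEGIN END\nBEGIN only\nplain\n' | sed '/BEGIN/{/END/!d;}'
```

The inline `{...}` group is the same program as beginOnly.sed, just passed on the command line.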

#beginEnd5.sed
#Delete the text between two words BEGIN and END
/BEGIN.*END/s///
/BEGIN/,/END/{
/BEGIN/ {
h
d
}
/END/! {
H
d
}
/END/ {
x
G
}
s/BEGIN.*END//
}
$ sed -f beginEnd5.sed input.dat
AWK:
The name stands for its authors Aho, Weinberger and
Kernighan.
• awk's purpose: a general purpose programmable
filter that handles text (strings) as easily as numbers
– This makes awk one of the most powerful
of the Unix utilities
• awk processes fields while sed only processes lines
• nawk (new awk) is the new standard for awk
– Designed to facilitate large awk programs
– gawk is a free nawk clone from GNU
• awk gets its input from
– files
– redirection and pipes
– directly from standard input
• A programming language for handling common
data manipulation tasks with only a few lines of
code
• awk is a pattern-action language, like sed
• The language looks a little like C but automatically
handles input, field splitting, initialization, and
memory management
– Built-in string and number data types
– No variable type declarations
• awk is a great prototyping language
– Start with a few lines and keep adding until
it does what you want
Awk Features over Sed:
• Convenient numeric processing
• Flexible printing
• Built-in arithmetic and string functions
• C-like syntax
Structure of an AWK Program:
• An awk program consists of:
– An optional BEGIN segment
• For processing to execute prior to
reading input
– pattern - action pairs
• Processing for input data
• For each pattern matched, the
corresponding action is taken
– An optional END segment
• Processing after end of input data
BEGIN {action}
pattern {action}
pattern {action}
.
.
.
pattern {action}
END {action}
Running an AWK Program:
• There are several ways to run an Awk program
– awk 'program' input_file(s)
• program and input files are
provided as command-line
arguments
– awk 'program'
• program is a command-line
argument; input is taken from
standard input (yes, awk is a
filter!)
– awk -f program_file input_files
• program is read from a file
Patterns and Actions:
• Search a set of files for patterns.
• Perform specified actions upon lines or fields that
contain instances of patterns.
• Does not alter input files.
• Processes one input line at a time
• This is similar to sed
Pattern-Action Structure:
• Every program statement has to have a pattern or an
action or both
• Default pattern is to match all lines
• Default action is to print the current record
• Patterns are simply listed; actions are enclosed in { }
• awk scans a sequence of input lines, or records, one
by one, searching for lines that match the pattern
– Meaning of match depends on the pattern
• Variables and control flow in the actions
• Convenient way of accessing fields within lines
Patterns:
• Selector that determines whether the action is to be
executed
• A pattern can be:
– the special token BEGIN or END
– a regular expression (enclosed with //)
– a relational or string match expression
– ! negates the match
– an arbitrary combination of the above
using && ||
• /NYU/ matches if the string
"NYU" is in the record
• x > 0 matches if the condition is
true
• /NYU/ && (name == "UNIX
Tools")
BEGIN and END patterns:
• BEGIN and END provide a way to gain control
before and after processing, for initialization and
wrap-up.
– BEGIN: actions are performed before the
first input line is read.
– END: actions are done after the last input
line has been processed.
Actions:
• An action may include a list of one or more C-like
statements, as well as arithmetic and string
expressions and assignments and multiple output
streams.
• The action is performed on every line that matches
the pattern.
– If a pattern is not provided, the action is
performed on every input line
– If an action is not provided, all matching
lines are sent to standard output.
• Since patterns and actions are optional, actions must
be enclosed in braces to distinguish them from
patterns.
Example:
ls | awk '
BEGIN { print "List of html files:" }
/\.html$/ { print }
END { print "There you go!" }
'
List of html files:
index.html
as1.html
as2.html
There you go!
Records:
• Default record separator is newline
– By default, awk processes its input a line
at a time.
• Could be any other regular expression.
• RS: record separator
– Can be changed in the BEGIN action
• NR is the variable whose value is the number of the
current record.
Fields:
• Each input line is split into fields.
– FS: field separator: default is whitespace
(1 or more spaces or tabs)
– the awk -Fc option sets FS to the
character c
• Can also be changed in BEGIN
– $0 is the entire line
– $1 is the first field, $2 is the second
field, ...
• Only fields begin with $; variables are unadorned
Simple Output From AWK:
• Printing Every Line
– If an action has no pattern, the action is
performed on all input lines
• { print } will print all input lines
to standard out
• { print $0 } will do the same thing
• Printing Certain Fields
– Multiple items can be printed on the same
output line with a single print statement
– { print $1, $3 }
– Expressions separated by a comma are, by
default, separated by a single space when
printed (OFS)
Variables:
• awk scripts can define and use variables
BEGIN { sum = 0 }
{ sum ++ }
END { print sum }

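The record and field variables above can be watched directly. A runnable sketch (the data are made up):

```shell
# Colon-separated sample records (made-up data).
printf 'ann:10\nbob:20\ncal:30\n' > recs.txt

# -F: sets FS; NR is the record number, NF the field count,
# and $NF the last field of the current record.
awk -F: '{ print NR, $1, $NF, "fields:", NF }' recs.txt
```

Since NR keeps its final value after the input ends, `END { print NR }` is a quick line counter.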
• Some variables are predefined
• NF, the Number of Fields
– Any valid expression can be used after a $
to indicate the contents of a particular field
– One built-in expression is NF, or Number
of Fields
– { print NF, $1, $NF } will print the number
of fields, the first field, and the last field in
the current record
– { print $(NF-2) } prints the third to last
field
• Computing and Printing
– You can also do computations on the field
values and include the results in your
output
– { print $1, $2 * $3 }
• Printing Line Numbers
– The built-in variable NR can be used to
print line numbers
– { print NR, $0 } will print each line
prefixed with its line number
• Putting Text in the Output
– You can also add other text to the output
besides what is in the current record
– { print "total pay for", $1, "is", $2 * $3 }
– Note that the inserted text needs to be
surrounded by double quotes
Selection:
• Awk patterns are good for selecting specific lines
from the input for further processing
– Selection by Comparison
• $2 >= 5 { print }
– Selection by Computation
• $2 * $3 > 50 { printf("%6.2f for
%s\n", $2 * $3, $1) }
– Selection by Text Content
• $1 == "NYU"
• $2 ~ /NYU/
– Combinations of Patterns
• $2 >= 4 || $3 >= 20
– Selection by Line Number
• NR >= 10 && NR <= 20
Arithmetic and variables:
• awk variables take on numeric (floating point) or
string values according to context.
• User-defined variables are unadorned (they need
not be declared).
• By default, user-defined variables are initialized to
the null string, which has numerical value 0.
Computing with AWK:
• Counting is easy to do with Awk
$3 > 15 { emp = emp + 1 }
END { print emp, "employees worked
more than 15 hrs" }
• Computing Sums and Averages is also simple
{ pay = pay + $2 * $3 }
END { print NR, "employees"
print "total pay is", pay
print "average pay is", pay/NR
}
Handling Text:
• One major advantage of Awk is its ability to handle
strings as easily as many languages handle numbers
• Awk variables can hold strings of characters as well
as numbers, and Awk conveniently translates back
and forth as needed
• This program finds the employee who is paid the
most per hour:
# Fields: employee, payrate
$2 > maxrate { maxrate = $2; maxemp = $1 }
END { print "highest hourly rate:",
maxrate, "for", maxemp }
String Manipulation:
• String Concatenation
– New strings can be created by combining
old ones
{ names = names $1 " " }
END { print names }
• Printing the Last Input Line
– Although NR retains its value after the last
input line has been read, $0 does not
{ last = $0 }
END { print last }
Built-in Functions:
• awk contains a number of built-in functions. length
is one of them.
• Counting Lines, Words, and Characters using
length (a poor man's wc)
{ nc = nc + length($0) + 1
nw = nw + NF
}
END { print NR, "lines,", nw, "words,", nc,
"characters" }
• substr(s, m, n) produces the substring of s that
begins at position m and is at most n characters
long.
Control Flow Statements:
• awk provides several control flow statements for
making decisions and writing loops
• If-Then-Else
$2 > 6 { n = n + 1; pay = pay + $2 * $3 }
END { if (n > 0)
print n, "employees, total pay is",
pay, "average pay is", pay/n
else
print "no employees are paid more
than $6/hour"
}
Loop Control:
• While
# interest1 - compute compound interest
# input: amount, rate, years
# output: compound value at end of each year
{ i = 1
while (i <= $3) {
printf("\t%.2f\n", $1 * (1 + $2) ^ i)
i = i + 1
}
}
Do-While Loops:
• Do While
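The highest-hourly-rate program above is self-contained and can be run on a small made-up data set:

```shell
# employee/payrate pairs (invented data), fed to the
# maxrate program from the notes.
printf 'ann 12\nbob 20\ncal 15\n' | awk '
$2 > maxrate { maxrate = $2; maxemp = $1 }
END { print "highest hourly rate:", maxrate, "for", maxemp }
'
```

It works because an uninitialized awk variable compares as 0, so the first record always wins the initial comparison.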
do {
statement1
}
while (expression)
For statements:
• For
# interest2 - compute compound interest
# input: amount, rate, years
# output: compound value at end of each year
{ for (i = 1; i <= $3; i = i + 1)
printf("\t%.2f\n", $1 * (1 + $2) ^ i)
}
Arrays:
• Array elements are not declared
• Array subscripts can have any value:
– Numbers
– Strings! (associative arrays)
• Examples
– arr[3] = "value"
– grade["Korn"] = 40.3
Array Example:
# reverse - print input in reverse order by line
{ line[NR] = $0 } # remember each line
END {
for (i = NR; (i > 0); i = i - 1) {
print line[i]
} }
Use a for loop to read an associative array
– for (v in array) { ... }
– Assigns to v each subscript of array
(unordered)
– The element is array[v]
Useful One (or so)-liners:
• END { print NR }
• NR == 10
• { print $NF }
• { field = $NF }
END { print field }
• NF > 4
• $NF > 4
• { nf = nf + NF }
END { print nf }
More One-liners:
• /Jeff/ { nlines = nlines + 1 }
END { print nlines }
• $1 > max { max = $1; maxline = $0 }
END { print max, maxline }
• NF > 0
• length($0) > 80
• { print NF, $0 }
• { print $2, $1 }
Even More One-liners:
• { for (i = NF; i > 0; i = i - 1)
printf("%s ", $i)
printf("\n")
}
• { sum = 0
for (i = 1; i <= NF; i = i + 1)
sum = sum + $i
print sum
}
• { for (i = 1; i <= NF; i = i + 1)
sum = sum + $i }
END { print sum }
Awk Variables:
• $0, $1, $2, $NF
• NR - Number of records processed
• NF - Number of fields in the current record
• FILENAME - name of the current input file
• FS - Field separator, space or TAB by default
• OFS - Output field separator, space by default
• ARGC/ARGV - Argument Count, Argument Value
array
– Used to get arguments from the command
line
Operators:
• = assignment operator; sets a variable equal to a
value or string
• == equality operator; returns TRUE if both sides
are equal
• != inverse equality operator
• && logical AND
• || logical OR
• ! logical NOT
• <, >, <=, >= relational operators
• +, -, /, *, %, ^
• String concatenation
Built-In Functions:
• Arithmetic
– sin, cos, atan, exp, int, log, rand, sqrt
• String
– length, substr, split
• Output
– print, printf
• Special
– system - executes a Unix command
• system("clear") to clear the screen
• Note the double quotes around the
Unix command
– exit - stop reading input and go
immediately to the END pattern-action pair

Prepared by G. Pradeep Reddy, Lecturer, Dept. C.S.E, JNTUACEA, Anantapur Page 19


Unit -1
Linux Programming
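The built-in variables NR, NF and $NF used in the one-liners above can be seen in action on a tiny sample. The file name demo.txt is assumed for illustration.

```shell
printf 'a b c\nd e\n' > demo.txt
# print the record number, the field count, and the last field of each line
awk '{ print NR, NF, $NF }' demo.txt
# -> 1 3 c
# -> 2 2 e
```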
AWK Example Programs:
1. Write an awk command to print the lines and line numbers in the given input file.
SOLUTION:
STEP1: create input file
$ cat > fileinput
welcome to
jntuamca
hello
STEP2: create awk script
$ cat > cmds.awk
{ print NR, $0 }
STEP3: execute awk program
$ awk -f cmds.awk fileinput
1 welcome to
2 jntuamca
3 hello
$
2. Write an awk command to print the first and second fields only if the third field value is >= 50 in the given input file (the input field separator is ":" and the output field separator is ",").
SOLUTION:
STEP1: create input data file
$ cat > file1
sachin:10:100
rahul:11:95
rohit:12:89
STEP2: execute awk program
$ awk -F':' '$3>=50 {print $1","$2}' file1
sachin,10
rahul,11
rohit,12
3. Consider marks.txt, a file that contains one record per line (comma-separated fields) of student data in the form: student id, student name, Telugu marks, English marks, Maths marks, Science marks, Social marks. Write an awk script to generate the result for every student in the form: student id, student name, total marks and result. The result is PASS if marks >= 30 in Telugu and English and marks >= 40 in the other subjects; the result is FAIL otherwise.
SOLUTION:
STEP1: create marks.txt file
$ cat > marks.txt
1001,name1,99,69,85,56,75
1002,name2,89,69,65,56,55
1003,name3,50,50,50,55,55
1004,name4,69,29,85,56,75
1005,name5,99,69,85,56,11
^d
STEP2: create marks.awk script file
$ cat > marks.awk
{
total=$3+$4+$5+$6+$7
if($3>=30 && $4>=30 && $5>=40 && $6>=40 && $7>=40)
print $1,$2,total, "Pass";
else
print $1,$2,total, "fail";
}
STEP3: execute awk program
$ awk -F "," -f marks.awk marks.txt
4. Write an awk program to print fields 1 and 4 of a file that is passed as a command line argument. The file contains lines of information separated by "," as the delimiter. At the end, the awk program must print the average of all the 4th-field data.
SOLUTION:
STEP1: create data file
$ cat > data
12,13,14,15,one
22,23,24,25,two
34,23,45,23,three
44,55,66,77,four
^d
STEP2: execute awk program
$ awk -F',' '{ print $1, $4; sum = sum + $4 } END { print "Average:", sum/NR }' data
5. Write an awk program to demonstrate user-defined functions and the system command.
SOLUTION:
STEP1: create data file
$ cat > data
12,13,14,15,one
22,23,24,25,two
34,23,45,23,three
44,55,66,77,four
^d
STEP2: create user.awk script file
$ cat > user.awk
{
if($3>0)
display($3)
}
function display(name)
{
print name
}
^d
STEP3: execute awk program
$ awk -F',' -f user.awk data
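In example 2 the output separator "," was written explicitly in the print statement; the same effect can be had by setting FS and OFS in a BEGIN block. A sketch on the same sample data:

```shell
printf 'sachin:10:100\nrahul:11:95\nrohit:12:89\n' > file1
# FS splits the input on ":", OFS joins printed fields with ","
awk 'BEGIN { FS = ":"; OFS = "," } $3 >= 50 { print $1, $2 }' file1
# -> sachin,10
# -> rahul,11
# -> rohit,12
```

Note that OFS is only used when the fields are given to print as a comma-separated list, as here.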
6. Write an awk script to count the number of lines in a file that do not contain vowels.
SOLUTION:
STEP1: create a file called input
$ cat > input
this is one
213
BCDEFG
This is last line
^d
STEP2: create vowels.awk script file
$ cat > vowels.awk
BEGIN{count=0}
!/[aeiou]/ {count++;print}
END{print "Number of lines="count}
^d
STEP3: execute awk program
$ awk -f vowels.awk input
7. Write an awk script to find the number of characters, words and lines in a file.
SOLUTION:
STEP1: create a file called file7
$ cat > file7
This is a file
YEs NO
1234
^d
STEP2: create awk script file
$ cat > lines.awk
BEGIN{words=0;characters=0}
{
characters+=length($0);
words+=NF;
}
END{print "lines=",NR," words=",words," characters=",characters}
^d
STEP3: execute awk program
$ awk -f lines.awk file7
Shell Programming:
Shell:
• The shell acts as an interface between the user and the OS (kernel); it is known as the "command interpreter".
• When you type ls:
– the shell finds the command (/usr/bin),
– the shell runs the command,
– you receive the output.
• A shell script is a collection of executables or commands placed in a file and executed together.
• It gives the user an option to execute a command based on some condition.
• It provides conditional and control statements (if, for, while, case, etc.).
Installing Shell Script:
• Type the following command in a terminal:
sudo apt-get update && sudo apt-get install bash
• The #!/bin/bash line is necessary for a shell program to be interpreted by bash:
#!/bin/bash
echo "hello, $USER. I wish to list some files of yours"
echo "listing files in the current directory, $PWD"
ls # list files
Basic Shell Programming:
• A script is a file that contains shell commands
– data structure: variables
– control structure: sequence, decision, loop
• First line of a bash shell script:
#! /bin/bash
#! /bin/sh
• To run:
– make executable: % chmod +x script
– invoke via: % ./script
• Alternatively, execute a shell script with:
sh filename.sh
(or)
bash ./filename.sh
Bash shell programming:
• Input
– prompting the user
– command line arguments
• Decision:
– if-then-else
– case
• Repetition
– do-while, repeat-until
– for
– select
• Functions
• Traps
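The create/chmod/invoke steps above can be tried with a minimal script. The file name hello.sh is an assumed example.

```shell
cat > hello.sh <<'EOF'
#!/bin/bash
echo "hello, $USER"
echo "listing files in $PWD"
EOF
chmod +x hello.sh   # make the script executable
./hello.sh          # run via the executable bit and the #! line
bash ./hello.sh     # equivalent: run via an explicit interpreter
```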
User input:
• The shell allows you to prompt for user input.
Syntax:
read varname [more vars]
or
read -p "prompt" varname [more vars]
• Words entered by the user are assigned to varname and "more vars"
• The last variable gets the rest of the input line
User input example:
#! /bin/sh
read -p "enter your name: " first last
echo "First name: $first"
echo "Last name: $last"
Special shell variables:
$0 name of the script
$1 … $9 positional parameters (command line arguments)
$# number of positional parameters
$* all positional parameters
$? exit status of the last command
$$ process ID of the current shell
Examples: Command Line Arguments
The 'set' command can be used to assign values to the positional parameters:
% cat > file1
$1 $2 $3 $4
Ctrl^d
% set tim bill ann fred
% echo $*
tim bill ann fred
% echo $#
4
% echo $1
tim
% echo $3 $4
ann fred
bash control structures:
• if-then-else
• case
• loops
– for
– while
– until
– select
if statement:
if command
then
  statements
fi
• statements are executed only if command succeeds, i.e. has return status "0"
test command:
Syntax:
test expression
[ expression ]
• evaluates 'expression' and returns true or false
Example:
if test -w "$1"
then
  echo "file $1 is write-able"
fi
• Here the echo runs only if the first argument passed by the user names a writable file.
The if-then-else statement:
if [ condition ]; then
  statements-1
else
  statements-2
fi
• executes statements-1 if condition is true
• executes statements-2 if condition is false
Else if:
• The word elif stands for "else if"
• It is part of the if statement and cannot be used by itself
if [ condition ]; then
  statements
elif [ condition ]; then
  statements
else
  statements
fi
Relational Operators:
-eq equal to
-ne not equal to
-gt greater than
-ge greater than or equal to
-lt less than
-le less than or equal to
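A quick sketch of the relational operators: note that -eq, -lt, etc. compare numbers, while = compares strings.

```shell
x=7
if [ "$x" -ge 5 ] && [ "$x" -le 10 ]; then
  echo "x is between 5 and 10"
fi
[ "$x" -eq 7 ] && echo "numeric comparison matched"
[ "$x" = "7" ]  && echo "string comparison matched"
```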
Compound logical expressions:
• and (&&) and or (||) must be enclosed within double square brackets: [[ ]]
Example: Using the ! Operator:
#!/bin/bash
read -p "Enter years of work: " Years
if [ ! "$Years" -lt 20 ]; then
  echo "You can retire now."
else
  echo "You need 20+ years to retire"
fi
Example: Using the && Operator:
#!/bin/bash
Bonus=500
read -p "Enter Status: " Status
read -p "Enter Shift: " Shift
if [[ "$Status" = "H" && "$Shift" = 3 ]]
then
  echo "shift $Shift gets \$$Bonus bonus"
else
  echo "only hourly workers in"
  echo "shift 3 get a bonus"
fi
Example: Using the || Operator:
#!/bin/bash
read -p "Enter calls handled:" CHandle
read -p "Enter calls closed: " CClose
if [[ "$CHandle" -gt 150 || "$CClose" -gt 50 ]]
then
  echo "You are entitled to a bonus"
else
  echo "You get a bonus if the calls"
  echo "handled exceeds 150 or"
  echo "calls closed exceeds 50"
fi
File Testing:
Symbol  Meaning
-d file True if 'file' is a directory
-f file True if 'file' is an ordinary file
-r file True if 'file' is readable
-w file True if 'file' is writable
-x file True if 'file' is executable
-s file True if the length of 'file' is nonzero
Example: File Testing:
#!/bin/bash
echo "Enter a filename: "
read filename
if [ ! -r "$filename" ]
then
  echo "File is not read-able"
  exit 1
fi
Finding whether a file is readable and writable:
#! /bin/bash
if [ $# -lt 1 ]; then
  echo "Usage: filetest filename"
  exit 1
fi
if [[ ! -f "$1" || ! -r "$1" || ! -w "$1" ]]
then
  echo "File $1 is not accessible"
  exit 1
fi
# The following THREE if-conditions produce the same result
* DOUBLE SQUARE BRACKETS
read -p "Do you want to continue?" reply
if [[ $reply = "y" ]]; then
  echo "You entered " $reply
fi
* SINGLE SQUARE BRACKETS
read -p "Do you want to continue?" reply
if [ $reply = "y" ]; then
  echo "You entered " $reply
fi
* "TEST" COMMAND
read -p "Do you want to continue?" reply
if test $reply = "y"; then
  echo "You entered " $reply
fi
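The file-test operators combine naturally with if/elif. A sketch; the names testdir, testfile and missing are created (or deliberately absent) only for illustration:

```shell
mkdir -p testdir
touch testfile
for f in testdir testfile missing
do
  if [ -d "$f" ]; then
    echo "$f is a directory"
  elif [ -f "$f" ]; then
    echo "$f is an ordinary file"
  else
    echo "$f does not exist"
  fi
done
```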
The case Statement:
• use the case statement for a decision that is based on multiple choices
Syntax:
case word in
  pattern1) command-list1
            ;;
  pattern2) command-list2
            ;;
  patternN) command-listN
            ;;
esac
case pattern:
• checked against word for a match; may also contain:
  *
  ?
  […]
  [:class:]
• multiple patterns can be listed via: |
Example 1: The case Statement:
#!/bin/bash
echo "Enter Y to see all files including hidden files"
echo "Enter N to see all non-hidden files"
echo "Enter q to quit"
read -p "Enter your choice: " reply
case $reply in
  Y|YES) echo "Displaying all (really…) files"
         ls -a ;;
  N|NO)  echo "Display all non-hidden files..."
         ls ;;
  Q|q)   exit 0 ;;
  *)     echo "Invalid choice!"; exit 1 ;;
esac
Repetition Constructs:
The while Loop:
• Purpose: to execute the commands in "command-list" as long as "expression" evaluates to true
Syntax:
while [ expression ]
do
  command-list
done
Example: while Loop:
#!/bin/bash
COUNTER=0
while [ $COUNTER -lt 10 ]
do
  echo The counter is $COUNTER
  let COUNTER=$COUNTER+1
done
The until Loop:
• Purpose: to execute the commands in "command-list" as long as "expression" evaluates to false
Syntax:
until [ expression ]
do
  command-list
done
Example: Using the until Loop:
#!/bin/bash
Stop="N"
until [ $Stop = "Y" ]; do
  ps -A
  read -p "want to stop? (Y/N)" reply
  Stop=`echo $reply | tr '[:lower:]' '[:upper:]'`
done
echo "done"
The for Loop:
• Purpose: to execute commands as many times as the number of words in the "argument-list"
Syntax:
for variable in argument-list
do
  commands
done
Example: Using the for Loop:
#!/bin/bash
# compute the average weekly temperature
for num in 1 2 3 4 5 6 7
do
  read -p "Enter temp for day $num: " Temp
  let TempTotal=$TempTotal+$Temp
done
let AvgTemp=$TempTotal/7
echo "Average temperature: " $AvgTemp
Select command:
• Constructs a simple menu from a word list
• Allows the user to enter a number instead of a word
• The user enters the sequence number corresponding to the word
Syntax:
select WORD in LIST
do
  RESPECTIVE-COMMANDS
done
• Loops until end of input, i.e. ^d (or ^c)
Select example:
#! /bin/bash
select var in alpha beta gamma
do
  echo $var
done
• Prints a numbered menu and a prompt:
1) alpha
2) beta
3) gamma
#?
looping over arguments:
• The simplest form of for iterates over all command line arguments:
#! /bin/bash
for parm
do
  echo $parm
done
break and continue:
• Interrupt a for, while or until loop
• The break statement
– transfers control to the statement AFTER the done statement
– terminates execution of the loop
• The continue statement
– transfers control to the done statement
– skips the test statements for the current iteration
– continues execution of the loop
Example:
for index in 1 2 3 4 5 6 7 8 9 10
do
  if [ $index -le 3 ]; then
    echo "continue"
    continue
  fi
  echo $index
  if [ $index -ge 8 ]; then
    echo "break"
    break
  fi
done
Shell Functions:
• A shell function is similar to a shell script
– stores a series of commands for execution later
– the shell stores functions in memory
– the shell executes a shell function in the same shell that called it
• Where to define
– In .profile
– In your script
– Or on the command line
• Remove a function
– Use the unset built-in
• Functions must be defined before they can be referenced
• They are usually placed at the beginning of the script
Syntax:
function-name () {
  statements
}
Function parameters:
• Need not be declared
• Arguments provided via the function call are accessible inside the function as $1, $2, $3, …
• $# reflects the number of parameters
• $0 still contains the name of the script (not the name of the function)
Example 1: function:
#!/bin/bash
fun () { # A somewhat more complex function.
  JUST_A_SECOND=1
  let i=0
  REPEATS=30
  echo "And now the fun really begins."
  while [ $i -lt $REPEATS ]
  do
    echo "-------FUNCTIONS are fun-------->"
    sleep $JUST_A_SECOND
    let i+=1
  done
}
fun
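The parameter rules above ($1, $2, $#) can be checked with a tiny function; the name greet is illustrative.

```shell
greet () {
  echo "received $# argument(s)"
  echo "first: $1, second: $2"
}
greet hello world
# -> received 2 argument(s)
# -> first: hello, second: world
```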
Example 2: function:
#!/bin/bash
hello()
{
  echo "You are in function hello()"
}
echo "Calling function hello()…"
hello
echo "You are now out of function hello()"
Until Statements:
• The until structure is very similar to the while structure, but it loops until the condition becomes true: "until this condition is true, do this".
until [ expression ]
do
  statements
done
$ cat countdown.sh
#!/bin/bash
echo "Enter a number: "; read x
echo ; echo Count Down
until [ "$x" -le 0 ]; do
  echo $x
  x=$(($x - 1))
  sleep 1
done
echo ; echo GO !
Array in Shell Scripting:
An array is a systematic arrangement of the same type of data. In a shell script, however, an array is a variable that can contain multiple values of the same or different types, since by default everything in a shell script is treated as a string. An array is zero-based, i.e. indexing starts with 0.
How to Declare an Array in Shell Scripting?
We can declare an array in a shell script in different ways.
1. Indirect Declaration
In indirect declaration, we assign a value to a particular index of the array variable; there is no need to declare the array first.
ARRAYNAME[INDEXNR]=value
2. Explicit Declaration
In explicit declaration, first we declare the array, then assign the values.
3. Compound Assignment
In compound assignment, we declare the array with a bunch of values; other values can be added later too.
ARRAYNAME=(value1 value2 .... valueN)
or, with explicit indices:
ARRAYNAME=([1]=10 [2]=20 [3]=30)
To Print Array Values in Shell Script:
To print all elements, [@] and [*] mean all elements of the array:
echo ${ARRAYNAME[*]}
echo ${ARRAYNAME[@]}
PROGRAM:
#! /bin/bash
# To declare static Array
arr=(prakhar ankit 1 rishabh manish abhinav)
# To print all elements of array
echo ${arr[@]}
echo ${arr[*]}
echo ${arr[@]:0}
echo ${arr[*]:0}
Output:
prakhar ankit 1 rishabh manish abhinav
prakhar ankit 1 rishabh manish abhinav
prakhar ankit 1 rishabh manish abhinav
prakhar ankit 1 rishabh manish abhinav
To print a particular element:
echo ${arr[3]}        # rishabh
echo ${arr[1]}        # ankit
To print elements from a particular index:
echo ${ARRAYNAME[WHICH_ELEMENT]:STARTING_INDEX}
echo ${arr[@]:0}
echo ${arr[@]:1}
echo ${arr[@]:2}
echo ${arr[0]:1}
Output:
prakhar ankit 1 rishabh manish abhinav
ankit 1 rishabh manish abhinav
1 rishabh manish abhinav
rakhar
To print elements in a range:
echo ${ARRAYNAME[WHICH_ELEMENT]:STARTING_INDEX:COUNT_ELEMENT}
# To print elements in range
echo ${arr[@]:1:4}        # ankit 1 rishabh manish
echo ${arr[@]:2:3}        # 1 rishabh manish
echo ${arr[0]:1:3}        # rak
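The declaration styles above can be mixed; bash also supports appending with +=, and "${arr[@]}" is the safe form for iteration. A sketch with a fresh example array:

```shell
arr=(red green blue)   # compound assignment
arr[3]=yellow          # indirect assignment extends the array
arr+=(purple)          # append another element (bash)
for colour in "${arr[@]}"
do
  echo "$colour"
done
```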
To count the length of the array:
# Size of an Array
echo ${#arr[@]}
echo ${#arr[*]}
Output:
6
6
Another example (ksh):
#!/bin/sh
NAME[0]="Zara"
NAME[1]="Qadir"
NAME[2]="Mahnaz"
NAME[3]="Ayan"
NAME[4]="Daisy"
echo "First Method: ${NAME[*]}"
echo "Second Method: ${NAME[@]}"
OUTPUT:
$ ksh main.ksh
First Method: Zara Qadir Mahnaz Ayan Daisy
Second Method: Zara Qadir Mahnaz Ayan Daisy
Shell Script Interrupts:
Handling shell interrupts is something you should consider. As a user is interacting with your script, they may decide to interrupt it by typing Ctrl-C, for example. Typically this will interrupt your shell script execution, forcing it to exit. Depending on what your shell script is doing, this could leave behind temporary files, or leave other files in a broken state. It would be useful if you could trap the interrupt, and handle it safely, before exiting the script.
This can be achieved on most shells using the 'trap' command. The trap command takes the following syntax:
trap [OPTIONS] [[ARG] SIGSPEC ... ]
The ARG is the command to be executed on signal delivery, while SIGSPEC is the name of the signal(s) to trap. Options include -h for help, -l to list signal names, or -p to print all defined signal handlers.
For example, to always return a 'user aborted' error code, the following line in your script could be used. Whatever value given to $exit_user_abort would be returned.
trap 'echo "`basename $0`: Ouch! User Aborted." 1>&2; exit $exit_user_abort' 1 2 15
The numbers 1, 2 and 15 at the end of this example define which interrupts we're interested in trapping. These numbers correspond to different kinds of interrupts:
1 SIGHUP (hangup)
2 SIGINT (interrupt, Ctrl-C)
15 SIGTERM (termination)
A short list is given here, but you can use 'trap -l' for a complete list.
If your trap runs several commands, it is possibly neater to call a shell function than to list the commands in-line. For example:
trap funcname 1 2 15
# Function to handle interrupts
funcname()
{
  echo "`basename $0`: Ouch! User Aborted." 1>&2
  exit $exit_user_abort
}
Shell programs:
To find the area of a triangle:
$ cat tri.sh
echo Area of the Triangle
echo Enter the Base
read b
echo Enter the Height
read h
echo " scale=2; 0.5 * $b * $h " | bc
OUTPUT:
$ sh tri.sh
Area of the Triangle
Enter the Base
10
Enter the Height
3
15.0

Prepared by G. Pradeep Reddy, Lecturer, Dept. C.S.E, JNTUACEA, Anantapur
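A typical use of trap is removing temporary files no matter how the script ends. This sketch also traps EXIT, so the handler runs on normal termination as well as on signals 1, 2 and 15; the names tmpfile and cleanup are illustrative.

```shell
#!/bin/bash
tmpfile=$(mktemp)
cleanup () {
  rm -f "$tmpfile"
  echo "cleaned up"
}
# run cleanup on exit, SIGHUP, SIGINT and SIGTERM
trap cleanup EXIT 1 2 15
echo "working with $tmpfile"
```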