Conceptual Framework of Open Source Systems
CONCEPTUAL FRAMEWORK
This chapter presents the concepts underlying the framework of the study, including the conceptual model of the system.
Introduction
Complex servers process and parse whatever people request. Information is
centralized so that end users, wherever they are, get the same information
available to them. Fast access at low cost is the main reason why
we use the Internet as a source of information. These servers must be error-free
and must be protected from intrusion. Web hosts in particular play a big
role in providing accurate information to anyone who wants it.
Pricing is the main concern of this study. Maintenance may not be that
expensive, because open-source software is open for the public to use. Today, different
companies buy high-priced security tools and appliances to protect their networks and
applications. Web servers are built for production to give the Internet-savvy
the information they need. They run different services such as Apache, DNS,
the MySQL database, FTP, mail services, and CGI. Hackers are there to interrupt these
services and postpone jobs, costing downtime and system undependability. This study therefore proposes
criteria that are practical, economical, and of high integrity to address the risks a
company faces in dealing with such data loss and attacks. With this, great attention is given to
system security and good performance that may protect not just the client's needs but the system as a whole.
Open source describes practices and methodologies that promote open access to the production and design process for
various goods, products, and resources. The term is most commonly applied to
the source code of software that is made available to the general public with
relaxed or non-existent restrictions on its use and modification. Before open source became widely adopted, developers and
producers used a variety of phrases to describe the concept; the term open
source gained popularity with the rise of the Internet and its enabling of diverse
production models and interactive communities. Subsequently, open source software became the most prominent face of open
source practices. The open source model can allow for the concurrent use of
different approaches in production, in which the results are made generally available. Participants in such a culture are
able to modify those products and redistribute them back into the community.
Linux was started by Linus Torvalds, who was then working on a UNIX system that ran on an expensive
platform. Because of his low budget, and his need to work at home, he decided
to create a UNIX-compatible system that would run on a cheaper platform, such as an IBM PC. He began his work in 1991, when he released
version 0.02, and worked steadily until 1994, when version 1.0 of the Linux kernel
was released.
It has been said that, technically speaking, Linux is a kernel, the core part
of the operating system that makes the whole thing run. Most people, however, refer to Linux as the entire
operating system, in the same sense as Microsoft Windows or Apple's Mac OS. Linux can replace Windows as a desktop operating system, and as a server it is a full-fledged UNIX, including true, stable multitasking, virtual memory, shared libraries,
and TCP/IP networking. It runs all the applications that a UNIX server system can,
including web servers like Apache, mail-serving software like Sendmail, and
database servers like Oracle and Informix, or more open applications like MySQL
and PostgreSQL. Linux supports a wide range of file system types and, through
packages such as Samba, can act as a file and print server and Primary Domain Controller (PDC). With clustering technology, Linux can
scale up to handle the supercomputing loads required by many scientific and
engineering applications.
The GNU/GPL
"GPL" stands for "General Public License". The most widespread such
license is the GNU General Public License, or GNU GPL for short. This can be
further shortened to "GPL" when it is understood that the GNU GPL is the one
intended.
The GPL grants the recipients of a computer program the rights of the free
software definition and uses copyleft to ensure those freedoms are preserved, even when the work is changed or added to.
Linux Distributions
There are lots of distributions of Linux. The most common are from Mandriva,
Red Hat, and Debian, and popular flavors include RHEL, CentOS, Fedora, Mandriva, SUSE,
Ubuntu, and FreeBSD.
Each distribution offers the same base Linux kernel and system tools but differs
in installation method and bundled applications. Each distribution has its own
strengths and weaknesses, so it is worth trying a few before settling on one.
The Red Hat distribution, by commercial vendor Red Hat Software, Inc., is one of
the most popular distributions. With a choice of GUI- and text-based installation
procedures, Red Hat 6.1 is possibly the easiest Linux distribution to install. It
offers easy upgrade and package management via the "RPM" utility, and includes
both the GNU Network Object Model Environment (GNOME) and the "K Desktop
Environment" (KDE), both popular GUI window managers for the X Window
System. This distribution is available for the Intel, Alpha, and Sparc platforms.
The Debian distribution, produced by the volunteer-run "Debian Project", is the darling of the Open Source community. It also offers easy upgrade
and package management via the "dpkg" utility. This distribution is available for
the Intel, Alpha, Sparc, and Motorola (Macintosh, Amiga, Atari) platforms.
The SuSE distribution includes the "K Desktop Environment" (KDE), and also offers easy upgrade and package
management via the "YaST" utility. This distribution is available for both Intel and
Alpha platforms.
The Caldera OpenLinux distribution is geared towards corporate users. With the new OpenLinux 2.2 release, Caldera has
raised the bar with what appears to be the easiest-to-install distribution of Linux yet.
The Linux-Mandrake distribution, by MandrakeSoft S.A., integrates the Red Hat or Debian distributions (your choice)
with additional value-added software packages beyond those included with the original
distributions.
The Slackware distribution offers a simple installation procedure, but poor upgrade and package management. It is still
based on the libc libraries, but the next version will probably migrate to the newer
glibc. It is recommended for users who are more technical and familiar with Linux.
Linux Intention
Frampton (2001) stressed that he had been using Linux for several years,
and he would like to think that he knows a bit about the operating system and
what it can and cannot do. As an avid USENET reader, he followed the
latest developments and, of course, the various flame wars that invariably crop
up, and the myths they spread that more than a few people believe. So he ran down a few of the more common
ones. One is the notion that, because a piece of software was written by volunteers with
no profit motive in mind, the results must clearly be inferior to commercial-grade
offerings. This may have been true in the past, when there was a lot of
freeware that was absolute garbage in the DOS and early Windows world, but
it is most certainly not true in recent days. The power of the Internet has made it
possible to bring together some of the brightest minds around the globe, allowing
collaboration on projects they find interesting. The people who have put a hand
into Linux and its packages come from diverse backgrounds, and all of them have different personal
reasons for wanting to contribute. Some are hard-core hackers who develop for
the love of coding; others have a need for something, for example a network
traffic monitor for a LAN at work, and decide to write it themselves; others are
academics and computer scientists who are using Linux for its research qualities.
Because the source code is not withheld from the end user, code used in Linux is scrutinized, debugged, and
improved upon by anyone who has the interest and ability. This act of peer
review is one of the reasons that Linux offers the high reliability and high
performance that it does. Do not forget: the Internet itself was built on, and runs
almost exclusively on, Open Source projects. The e-mail you exchange on a daily
basis with people around the world has an 80% chance of being handled on one
or both ends by Sendmail, and the web pages you browse while "Surfin' the Web" are
quite likely served by Apache.
"There is no support for Linux." Hearing this myth somewhat sickens him, given his
experience with one very popular commercial operating system, where the
vendor's so-called "support" was completely useless. First of all, there is support
for Linux. Yes, commercial support. There are some companies that can provide
as much support as you are willing to pay for, offering telephone and e-mail
support, with many offering to come right to your door to deal with the problem!
However, in 99% of the situations you will run into with Linux, you will be able to
accomplish what you wish if you can simply get the answer to a question or two,
and free community help is widely available. There are lots of forums to look in when it comes to bugs and fixes. Answers
are available just by posting an issue once; it will be looked at by people in the
community who have already encountered, and fixed, that specific problem.
Apache
The Apache group named its software "Apache" because it was "a patchy" server, made out of patches for the freely
available source code of the NCSA HTTPd Web Server. For a while after the
NCSA project stalled, webmasters had developed their own
patches for the code, either to fix bugs or add features that they wanted. There
was a lot of this code floating around and people were freely sharing it, but it was
completely unmanaged. After a while, Brian Behlendorf and Cliff Skolnick set up a
centralized repository of these patches, and the Apache project was born. The
project is still composed of a rather small core group of programmers, but anyone
is welcome to submit patches to the group for possible inclusion in the code. In
the last couple of years, there has been a surge of interest in the Apache project,
partially buoyed by new interest in Open Source. It is also due, in part, to IBM's
commitment to support and use Apache as the basis for the company's Web
offerings; it has made more sense to use an established, proven Web server than to try to
write their own. The consequences of this interest have included stable versions and the founding of the Apache Software Foundation (ASF).
A board of directors, elected on an annual basis by ASF members,
oversees the corporation. The ASF provides a foundation for several different
Open Source software development projects, including the Apache Web Server.
The next table (Table 1) shows current statistics on the growth of Apache's share among Web
servers:
Table 1.
Percentage of Web Server Users
Using the File Transfer Protocol (FTP) is a popular way to transfer files
from machine to machine across a network. Clients and servers have been
written for all the popular platforms, thereby often making FTP the most
convenient way of exchanging files between systems.
You can configure FTP servers in one of two ways. The first is as a private,
user-only site, which is the default configuration for the FTP server; I will cover
this configuration here. A private FTP server allows only users with accounts on the system to connect and transfer files.
You can place access controls on these users so that certain users can be restricted. The second configuration, an anonymous FTP server,
allows anyone on the network to connect to it and transfer files without having an
account. Due to the potential security risks involved with this setup, you should
take precautions to allow access only to certain directories on the system (Pitts et
al., 1998).
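The private-versus-anonymous distinction above can be sketched in a few lines. This is an illustrative Python sketch, not a server configuration: the `is_allowed` helper, the directory names, and the host in the comment are all assumptions, and the `fetch_listing` function (which uses the standard-library `ftplib` client) is shown but never called, since it would need a live FTP server.

```python
import posixpath
from ftplib import FTP  # standard-library FTP client


def is_allowed(path, allowed_roots):
    """Return True if a normalized path falls under one of the permitted
    directory trees -- the kind of directory restriction the text
    recommends for anonymous FTP setups."""
    norm = posixpath.normpath(path)
    return any(norm == root or norm.startswith(root + "/")
               for root in allowed_roots)


def fetch_listing(host, directory="/pub"):
    """Connect anonymously and list a directory (requires a live server)."""
    with FTP(host) as ftp:
        ftp.login()          # anonymous login, no account needed
        ftp.cwd(directory)
        return ftp.nlst()

# Usage (requires network access to a real anonymous FTP host):
#     names = fetch_listing("ftp.example.org")
```

Note how path normalization defeats `..` tricks: `/pub/../etc/passwd` normalizes to `/etc/passwd` and is rejected.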
Email Server
At least two components are involved in electronic mail. These are MTAs
and MUAs. MTA stands for Mail Transfer Agent, and MUA stands for Mail User
Agent.
The MTA is the server application that handles sending and receiving e-mail;
your message is handed to the MTA after you press the Send button. Likewise, any incoming mail for you is
handled by the MTA. The MTA's responsibilities include things such as the
following:
Queuing outgoing mail so that clients will not have to wait for the mail to
actually be sent.
Accepting mail for clients and placing that mail in a holding area until the
clients retrieve it.
Mail transfer is done with the protocol called SMTP, which stands for Simple
Mail Transfer Protocol. As the name suggests, the protocol is really quite simple.
It can send and receive only plain text, and it uses relatively simple commands to
move messages between hosts.
The other necessary part of the e-mail system is the MUA, or Mail User
Agent. The MUA is the client that the user actually interacts with. Common MUAs
with which you might be familiar are Microsoft Outlook, Eudora, and Outlook Express.
SMTP
SMTP is the standard way of transferring mail over the Internet, and the sendmail program provides the
services needed to support SMTP connections for Linux. It helps to begin with the various tasks that sendmail performs (such as mail routing). As
with any large software package, sendmail has its share of bugs. Although the
bugs that cause sendmail to fail or crash the system have been almost
completely eliminated, security holes that provide root access are still found from
time to time.
To understand the different jobs that sendmail performs, you need to know
a little about Internet protocols. Protocols are simply agreed-upon standards that
software and hardware use to communicate. Protocols are usually layered, with
higher levels using the lower ones as building blocks. For example, the Internet
Protocol (IP) sends packets of data back and forth without building an end-to-end
connection. The Transmission Control Protocol (TCP), which is built on top of IP, provides
reliable, connection-oriented delivery for higher-level protocols such as the Simple Mail Transfer Protocol (SMTP). Together, TCP/IP provides the
basic network services for the Internet. Higher-level protocols such as the File
Transfer Protocol (FTP) and SMTP are built on top of TCP/IP. The advantage of
such layering is that programs which implement the SMTP or FTP protocols do
not have to know anything about transporting packets on the network and making
connections to other hosts. They can use the services provided by TCP/IP for
that job. SMTP defines how programs exchange e-mail on the Internet. It does
not matter whether the program exchanging the e-mail is sendmail or some other MTA; as long as both
programs implement the SMTP protocol correctly, they can exchange mail.
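The simplicity the text describes is visible in the SMTP command sequence itself. The sketch below just builds that plain-text dialogue as a Python list; the hostnames and addresses are made up, and a real client (such as Python's standard-library `smtplib`) would send these lines over a TCP connection and read a numeric reply after each one.

```python
def smtp_dialogue(sender, recipient, body):
    """Build the plain-text command sequence an SMTP client sends.
    Every command is a short, human-readable line -- which is why
    the protocol is called 'simple'."""
    return [
        "HELO client.example.com",     # identify ourselves to the server
        f"MAIL FROM:<{sender}>",       # envelope sender
        f"RCPT TO:<{recipient}>",      # envelope recipient
        "DATA",                        # what follows is the message
        body,
        ".",                           # a lone dot ends the message body
        "QUIT",                        # close the session
    ]
```

In practice you would rarely speak SMTP by hand; the point is only that each step of a mail transfer is one short text command.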
SQL is the standard language for working with relational databases. Almost all database vendors support SQL while adding their own SQL
extensions. Although based on the relational
model, it was first made commercial by Relational Software, now called Oracle
Corporation, in the late 1970s. D. Chamberlin of IBM first defined a language with which
the business user could define, query, or manipulate a database with simple
statements; SQL can also be embedded within other procedural languages (e.g., COBOL, C, Pascal), saving significant development effort.
SQL has been created such that it is intuitive, simple (relatively speaking),
non-procedural (that is, one need not specify step-by-step instructions to execute
certain actions), and maps to the human cognitive model. Ideally, programmers and
business users need not know how or where data is stored. They should be able
to specify what they want and how they want it, given their requirements.
Furthermore, they should be able to specify it in the way the human mind frames it. In practice, translating some requirements
into SQL statements may be difficult and the constructs may not always be powerful
enough, so many developers have continued programming in COBOL or other languages and then used SQL (and SQL within those languages).
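The declarative, "say what you want" style described above can be seen in a tiny example. This sketch uses Python's standard-library `sqlite3` module (SQLite speaks the same core SQL); the table and the data in it are invented purely for illustration.

```python
import sqlite3

# An in-memory database: created fresh, discarded on close.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 120), ("bob", 75)])

# Declarative query: we state WHAT we want (names with balance > 100),
# not HOW to scan the table -- the engine decides that.
rows = conn.execute(
    "SELECT name FROM accounts WHERE balance > 100").fetchall()
conn.close()
```

The same `SELECT ... WHERE ...` phrasing works, with minor dialect differences, across the vendors the text mentions.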
MYSQL
To add, access, and process data stored in a computer database, you need a database
management system such as MySQL. The SQL part of "MySQL" stands for "Structured Query Language". MySQL is
Open Source software, which means that it is possible for anyone to use
and modify it. Anybody can download MySQL from the Internet and use it without
paying anything, and anybody so inclined can study the source code and change it
to suit their needs. MySQL uses the GPL (GNU General Public License) to define
what you may and may not do with the software in different situations.
MYSQL Usability
MySQL is very fast, reliable, and easy to use. It has a practical set of
features developed in close cooperation with its users. MySQL was originally developed to handle large databases much
faster than existing solutions and has been successfully used in highly
demanding production environments. Under constant
development, MySQL today offers a rich and useful set of functions. Its
connectivity, speed, and security make MySQL highly suited for accessing
databases over a network through its many interfaces. MySQL is also provided as a multi-threaded library which you can link
into your own applications, and many people will find that their favorite application or language already supports
it. The official way to pronounce MySQL is "My Ess Que Ell" (not "my sequel").
History of MYSQL
We started out with the intention of using mSQL to connect to our
tables using our own fast low-level (ISAM) routines. However, after some testing
we came to the conclusion that mSQL was not fast enough nor flexible enough for
our needs. This resulted in a new SQL interface to our database, but with almost
the same API interface as mSQL. This API was chosen to ease porting of third-party
code. The derivation of the name MySQL is not perfectly clear. Our base
directory and a large number of our libraries and tools have had the prefix "my"
for well over 10 years. However, Monty's daughter (some years younger) is also
named My. Which of the two gave its name to MySQL is still a mystery, even for us.
MySQL's features include APIs for C, C++, Eiffel, Java, Perl, PHP, Python, and Tcl, and it is
fully multi-threaded using kernel threads, which means it can easily use multiple CPUs if they are available.
Hosts on a network are addressed by number, while people prefer names, and various translation mechanisms have been devised to make this
possible. The DNS (Domain Name Service) is one such method, now used almost
everywhere. Because name translation is a very common operation, whatever translation method we use must be very
fast. Originally, the translation table was maintained by SRI (Stanford Research Institute) in the [Link] file, each line of which
contains the name and address of a host. Anyone could obtain a copy of this file
via FTP and let their resolver use it locally. This scheme worked well when there
were only a few machines, but it quickly grew impractical as more people began
connecting: the file grew, and lookups progressively slowed down because the resolver took longer to search the list of
hosts each time. Changes to the database also took forever to make and propagate.
DNS was designed to manage the database in a distributed fashion to accommodate its size and the need for frequent updates,
with lookups kept fast by caches. Authority over portions of the database is delegated to people who are
able and willing to maintain them in a timely manner, so that updates are no
longer a bottleneck. DNS is a simple but delicate system that is vital to today's internet. Errors
might manifest themselves in far-from-obvious ways, long after the changes that
caused them were made, so a careful understanding of the system will help to make one's experiences as a DNS admin pleasant ones (Blum,
2002).
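The pre-DNS host-file scheme described above can be illustrated with a toy resolver. The hostnames and addresses here are made up, and the text format (one "name address" pair per line) is only a simplified stand-in for the real host file; the point is the linear search, whose cost grows with every host added, which is exactly why the scheme stopped scaling.

```python
# A toy host table in the style of the early centralized host file:
# one "name address" pair per line, searched from top to bottom.
HOSTS = """\
gateway 10.0.0.1
mailhub 10.0.0.25
www     10.0.0.80
"""


def resolve(name, hosts_text):
    """Linear search of the host list, as early resolvers did locally.
    Every lookup scans from the top, so lookup time grows with the
    number of hosts -- the scaling problem DNS was built to solve."""
    for line in hosts_text.splitlines():
        host, addr = line.split()
        if host == name:
            return addr
    return None   # unknown host
```

DNS replaces this single flat file with a distributed, delegated, cached database, but the question it answers per lookup is the same one `resolve` answers here.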
Network
In the late 1960s, the U.S. Department of Defense (DOD) found that sharing
information among DOD staff, research labs, universities, and contractors had hit
a major obstacle: the various entities had computer systems from different
vendors that could not talk to one another. Research into connecting them produced the network
that is the foundation of today's Internet. During the 1970s, this network migrated
to a new, core protocol design that became the basis for TCP/IP. The result is a worldwide network
that allows computers all over the world to communicate. It is growing at such a
phenomenal rate that any estimate of the number of computers and users on the
Internet would be out of date by the time this book went to print! Nodes include
universities, major corporations, research labs in the United States and abroad,
schools, businesses both large and small, and individually owned computers.
The explosion in past years of the World Wide Web has driven the Internet's
growth. Its features include downloadable programs, news on any topic, public forums and information exchanges, and e-mail.
Another feature is remote login to any computer system on the network by
using the Telnet protocol. Because of the number of systems that are
connected, distributed projects such as the 1997 decryption of the Data Encryption Standard are
possible only with the "everything is connected to everything else" behavior of the
Internet.
Moreover, ISO (2000) noted that many different types of computers are used
today, varying in operating systems, CPUs, network interfaces, and many other
characteristics; to let them communicate, the ISO defined the seven-layer Open
Systems Interconnection (OSI) model. The OSI model does not specify any
particular implementation; rather, by dividing network communication into subtasks,
the problem becomes more manageable, and each subtask can be optimized
separately. The seven layers, from top to bottom, are:
Application
Presentation
Session
Transport
Network
Data Link
Physical
The next table (Table 2) identifies the services provided at each OSI layer:
Table 2.
OSI Layers
Layer Description
Physical (Layer 1) This layer provides the physical connection between a
computer system and the network. It specifies connector
and pin assignments, voltage levels, and so on.
Data Link (Layer 2) This layer "packages" and "unpackages" data for
transmission. It forms the information into frames. A
frame represents the exact structure of the data
physically transmitted across the wire or other medium.
Network (Layer 3) This layer provides routing of data through the network.
Transport (Layer 4) This layer provides sequencing and acknowledgment of
transmission.
Session (Layer 5) This layer establishes and terminates communication
links.
Presentation (Layer 6) This layer does data conversion and ensures that data is
exchanged in a universal format.
Application (Layer 7) This layer provides an interface to the applications that a
user executes: a "gateway" between user applications
and the network communication process.
Each layer communicates with its peer in other computers; layer 4 on one system, for example, talks to layer 4 on another.
When information is passed from one layer down to the next, a header is added
to the data to indicate where the information is coming from and going to. The
header-plus-data block of information from one layer becomes the data for the
next. For example, when layer 4 passes data to layer 3, it adds its own header;
layer 3 then treats the whole block received from layer 4 as data and adds its own header before passing that combination
down.
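The header-wrapping process just described can be sketched in a few lines. The header names and the `|` separator below are invented for illustration; real layers prepend binary headers, but the nesting principle is the same.

```python
def encapsulate(payload, headers):
    """Wrap application data in successive layer headers, top-down.
    Each layer's header-plus-data block becomes the next layer's data."""
    data = payload
    for header in headers:   # e.g. transport, then network, then link
        data = header + "|" + data
    return data


def decapsulate(frame):
    """The receiving peer strips one header per layer, bottom-up,
    and hands the remainder to the layer above."""
    header, _, data = frame.partition("|")
    return header, data
```

So wrapping the message "MAIL" with TCP, then IP, then an Ethernet-style header yields a frame whose outermost header is the last one added, which is also the first one the receiver removes.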
Before the advent of the OSI model, the U.S. Department of Defense
defined its own networking model, known as the DOD model. The DOD model is
closely related to the TCP/IP suite of protocols. TCP/IP does not make as fine
distinctions between the top layers of the protocol stack as OSI does. The top
three OSI layers are roughly equivalent to the Internet process protocols; some
examples of process protocols are Telnet, FTP, SMTP, NFS, SNMP, and DNS.
The Transport layer of the OSI model corresponds to the host-to-host protocols responsible for reliable data
transfer; examples of these are TCP and UDP. TCP provides connection-oriented, acknowledged delivery.
UDP is similar to TCP, except that it is not connection-oriented and does not
acknowledge data receipt. UDP only receives messages and passes them along
to the upper-level protocols. Because UDP does not have any of the overhead
related to TCP, it provides a much more efficient interface for such actions as
remote disk services. The Internet Protocol (IP) is responsible for connectionless
communications between systems. It maps onto the OSI model as part of the
Network layer, which is responsible for moving information around the network
through routing, which determines the systems and the path used to send a message. IP provides
the same functionality as the Network layer and helps get messages between
systems, but it does not guarantee the delivery of these messages. IP may also
fragment the messages into chunks and then reassemble them at the
destination. Each fragment may take a different network path between systems;
if the fragments arrive out of order, IP reassembles the packets into the correct order.
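The connectionless, no-acknowledgment behavior of UDP is easy to demonstrate with Python's standard-library `socket` module. The sketch below sends one datagram between two sockets on the loopback interface; the message text is arbitrary. Notice there is no connection setup and no acknowledgment: the sender just fires the datagram.

```python
import socket

# A receiving UDP socket bound to the loopback interface.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
receiver.settimeout(5)               # don't block forever if delivery fails
addr = receiver.getsockname()        # (ip, port) the sender will target

# The sender needs no connect() call -- UDP is connectionless.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"status ping", addr)  # fire and forget; no acknowledgment

message, _ = receiver.recvfrom(1024)
sender.close()
receiver.close()
```

With TCP the same exchange would require `listen`/`accept`/`connect` handshaking before any data moved; that setup and the per-segment acknowledgments are exactly the overhead UDP omits.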
IP Addresses
An IP address is written as a series of four octets. These octets each define a unique address, with part of the
address identifying the network and the remainder identifying a particular node, such as a
workstation. An address starting with a zero references the local node within its current network.
IP Addressing Classes
IP addresses are assigned in classes, depending on the application and the size of an organization. The three most
common classes are A, B, and C. These three classes represent the number of
locally assignable bits available for the local network. Table 3 shows the
network classes.
Table 3.
Network Classes Table
Class A addresses are used for a few very large networks or collections of
related networks. Class B addresses are used for large networks having more
than 256 nodes (but fewer than 65,536 nodes). Class C addresses are used by
smaller networks with fewer than 256 nodes. Class D is
reserved for multicast messages on the network, and class E is reserved for
experimental use.
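Under the traditional (pre-CIDR) scheme described above, the class of an address can be read off its first octet alone. A minimal sketch, using the conventional octet ranges; the sample addresses in the usage note are arbitrary:

```python
def ip_class(address):
    """Classify a dotted-quad IPv4 address by its first octet, using
    the traditional class ranges (0 and 127 in the class A range are
    reserved/special in practice)."""
    first = int(address.split(".")[0])
    if first < 128:
        return "A"    # 0-127: few networks, many hosts per network
    if first < 192:
        return "B"    # 128-191: mid-sized networks
    if first < 224:
        return "C"    # 192-223: small networks, < 256 hosts
    if first < 240:
        return "D"    # 224-239: multicast
    return "E"        # 240-255: reserved/experimental
```

For example, `10.1.2.3` falls in class A and `192.168.1.1` in class C under this scheme.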
Naming Network
Naming the computers on a network takes some planning. When you select names, keep network management and user
convenience in mind; an existing naming standard in your organization removes any chance of
confusion. If not, there is plenty of room for imagination. Computer and
network names can be as simple as naming the workstations after the users,
such as Diane, Beth, or John. If you have many similar computers, numbering
them (for example, PC1, PC2, and PC128) may be appropriate. Naming must be
done in a way that gives unique names to computer systems. Do not name a
computer "the computer in the north office" and expect users not to complain. After
all, even the system administrator must type the names of computers from time
to time. Also avoid names like oiiomfw932kk. Although such a name may prevent
network intruders from connecting to your computer, it may also prevent you from
connecting to your workstation. Names that are distinctive and follow a theme
work well, helping the coordination of future expansion and giving the users a
sense of connection with their machines. After all, it is a lot easier to have a good
feeling about a memorable name. Although the Internet Protocol allows names up to 255 characters long, you should
avoid very long names, as some systems cannot handle them. (Each label can
be at most 63 characters.) Choose names that are meaningful to the systems within the network. Following are examples of names that you can
use:
[Link]
[Link]
The following are examples of names that are difficult to use or remember:
[Link]
[Link]
A name could encode room 345 on network 56 with network executive functions, but this type of
name is difficult for anyone simply trying to reach a particular node.
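The length limits mentioned above (a 255-character name made of dot-separated labels of at most 63 characters) can be checked mechanically. A sketch of such a validator; the character rule used here (letters, digits, and interior hyphens per label) is the common hostname convention and is an assumption rather than something the text spells out:

```python
import re

# One label: starts and ends with a letter or digit,
# with letters, digits, or hyphens in between.
LABEL = re.compile(r"^[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?$")


def valid_hostname(name):
    """Check the naming limits discussed above: whole name at most
    255 characters, each dot-separated label at most 63 characters,
    labels made of letters, digits, and interior hyphens."""
    if not name or len(name) > 255:
        return False
    return all(len(label) <= 63 and LABEL.match(label)
               for label in name.split("."))
```

Note that a name like `oiiomfw932kk` passes the syntactic check; the text's advice against it is about memorability, which no validator can enforce.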
Network security can be pictured as a set of shells around
central, inner layers. The outer shells consist of systems such as routers and firewalls,
which form common methods for securing and watching entire LANs or WANs.
This article will examine and illustrate the implementation of the inner shells, the
hosts and services themselves. Although the outer security layers play a major role in overall security,
security risks do not always stem from the outside, but from what your own
users, employees, and contractors are trying to do with your internal systems and
networks. Almost half of all system attacks come from within your LAN/WAN, and
because these attackers know more about your internal systems, they are
potentially more dangerous to those systems.
To help protect your systems, you do not need to go out and buy some big,
pretty GUI. In fact, using off-the-shelf security can be a security risk in itself.
Server Hardening
Of course, the first step in hardening any server is to make sure the
server is fully patched and that the required foundational security steps are properly
in place. In the open source environment, there is a plethora of tools to help harden the network stack
and services on our servers. Linux is particularly nice because it comes with its
own kernel-level IP/networking tools, ipchains and iptables, for doing stateful
packet inspection and control. Ipchains allows us to monitor incoming traffic and
make decisions based on what we see coming into the server from the network.
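The decision logic such rule chains encode can be illustrated with a toy model. This is Python purely for illustration, not how ipchains or iptables are configured; the rules, addresses, and ports below are invented, and the only real behaviors borrowed are first-match-wins evaluation and a default-deny final rule.

```python
# A toy first-match-wins rule list, mimicking the accept/drop decision
# a packet-filter chain makes about incoming traffic. Illustrative only.
RULES = [
    {"src": "10.0.0.", "port": 22,   "action": "ACCEPT"},  # SSH from internal net
    {"src": "",        "port": 80,   "action": "ACCEPT"},  # web traffic from anywhere
    {"src": "",        "port": None, "action": "DROP"},    # default deny
]


def decide(src_ip, dst_port):
    """Return the action of the first rule matching the packet.
    An empty 'src' prefix matches any source; port None matches any port."""
    for rule in RULES:
        if (src_ip.startswith(rule["src"])
                and rule["port"] in (None, dst_port)):
            return rule["action"]
    return "DROP"   # no rule matched: deny by default
```

So an SSH attempt from outside the 10.0.0.x network falls through the first two rules and hits the default-deny rule, which is the posture a hardened server's real chain would typically take.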
Security matters to most businesses. Networked computer systems are used to share and access
key information and resources among millions of users throughout all types of
organizations, and much of that information is confidential and/or intended for use only by specific authorized individuals.
The ability of the network system to prevent unauthorized access and to control use is therefore essential. Security covers both the physical system
and stored data; this includes protection from damage, theft, unauthorized
access, and user errors. With the use of Windows NT Workstation and Windows
NT Server, administrators are given a range of security options. The security implementation is easy for both administrators and end
users, and the system comes loaded with features and tools that make it easy to customize security for your
environment (for example, controlling Read and Write access).
The security model maintains security information for all user and group accounts
and governs resource access through permissions and the use of user accounts. You can create an almost unlimited number of
accounts. In support of all this control over the actions of a large number of users, it
retains the flexibility for network users to accomplish their daily tasks via a friendly
interface. Policies and profiles are used for this in the NT security subsystem.
You can define security policies that apply to the domain as a whole. The
trust relationship policy defines relationships with other domains. The user rights
policy controls the access rights given to groups and user accounts. The account
policy controls how user accounts use passwords. The audit policy controls which
events are recorded.
Auditing is built into Windows NT. This allows you to track which user
account was used to attempt a particular kind of access to files or other objects.
Auditing also can be used to track logon attempts, system shutdowns and
restarts, and similar events. These features support the monitoring of events
related to system security, helping you identify any security breaches and
determine the extent and location of any damage. The level of audited events is
customizable to your needs, and the security log in the Event Viewer can list and
filter the audited events.
SECURITY LEVELS
Windows NT security ranges from none at all to the C2 level of security required by U.S. government agencies. This
section describes three security categories, low (or none), medium, and
maximum, and the security measures used to obtain each level. These categories
are arbitrary, and you will probably combine characteristics of these categories
to fit your own needs. There is much discussion in the computer industry about security and levels of security. Why not just set
everything to maximum? Because securing resources complicates users' work with those resources, and it is also extra work for
administrators.
Here's what can happen: suppose only members of the Accounts Payable
user group are allowed to access AP records, and a new person is hired into that
group. For starters, someone needs to create an account for the new user and add
it to the Accounts Payable group. If the new account is not made a member of the Accounts Payable group, the
new user cannot access the AP records and will be prevented from contributing
any meaningful work. Or, if the account is created and made a member of the
Accounts Payable group, you'll need to consider what other access privileges come
along with that membership.
Another possible problem may occur when security is too tight: users will
try to "beat the system" in order to get work done. Take passwords, for instance. You
might decide to make them long and require that they be changed often. Users
may find it hard to remember their passwords and will write them down to avoid
forgetting them and being locked out of the network. Another dangerous password
practice: users who are denied access to files they truly need to use are "loaned" other employees'
passwords.
Low Security
Security may not be of much concern to you or your organization when the
computer holds nothing sensitive or valuable; in that case most
security precautions might be unnecessary, and the system can allow everyone full access to its resources.
Lambert (1997) also claims that for the simplest level of security, take the
same precautions you would with any piece of valuable equipment to protect
against theft. This can include locking the room when no one is using the
computer, or using a locked cable to attach the unit to a wall. You might also want
to establish procedures for moving or repairing the computer, so that the hardware
is not tampered with. Protect the computer and all peripherals from power spikes, and also perform regular disk scans and defragmentation.
For low security, none of the server security features are used. You might
even allow automatic logon to the Administrator account (or any other user
account). This allows anyone with physical access to the computer to turn it on
and immediately have full access to its resources. Even if you choose low-level
security, do not let a bug get you! Take adequate precautions against
viruses; they can damage your data and prevent programs from operating, and
a virus may spread from your low-security computer to a more secure
machine.
Bear in mind that low security is not the norm. Most computers are used to
store sensitive and/or valuable data. This could be financial data, personal files,
or other information whose loss, whether accidental or deliberate, would be costly.
Finally, if you do choose low security, keep in mind that the computer's
users need to be able to do their work, with minimal barriers to the resources
they need.
Medium Security
Most installations call for at least medium security. For medium security, however, you will want to ensure you include all of the following precautions.
Warning Banners
Keeping unauthorized users out of the system is, of
course, the goal of all security systems; unfortunately, however, simply making
the system secure may not be enough. You may also need to post
warnings that it is against company policy, or even against the law, to intrude upon
your system. In recent court cases, the argument has been put forth that a logon
screen "invites" you to log on, and therefore a clever hacker is "justified" in
entering the system. Before a user logs on to the system, the server can display a message box
with the caption and text of your choice. This mechanism is often used to issue a
warning that users will be held liable if they attempt to use the
computer without proper authorization.
The computer should still be physically secured and protected like any other valuable equipment. Keep the
machine somewhere the level of physical access is acceptable.
If you use a physical lock (a cable from the computer to a wall, for
instance), keep the key in a safe place for additional security. Remember, if the
key is lost, an intruder has an easy way in. Users must also follow security
policies and form good logon habits, such as logging off at the end of
each day and memorizing (rather than writing down) their passwords.
In the security system, a series of specific steps is taken to set up the accounts
required to use the computer. The secured server provides a GUI tool, User
Manager, for creating, deleting, and disabling user accounts. User Manager also
allows you to set password policies and other security-system policies, and to
grant rights to users. The principle of least privilege advocates that, to avoid
accidental changes to secure resources, the account with the least privilege that
can accomplish the task should be used. Use separate accounts for administrative
activities and user activity: all administrators should have two user accounts, one
for administrative tasks and one for general activity. For example, viruses can do
much more damage if activated from an account with administrative privileges.
The built-in Administrator account is the only one that cannot be locked out and is
thus attractive to hackers; by renaming the account, you force hackers to guess the
account name as well as the password. Permissions should prevent ordinary users from
deleting any files, directories, or Registry keys they do not own. If the computer is
for public use, restrict its accounts even further.
Under medium security, all users should always press Ctrl+Alt+Del before
logging on. “Trojan horse” programs are designed to collect account passwords
by imitating the logon dialog; pressing Ctrl+Alt+Del defeats such programs and
provides a secure logon screen. Users should also either log off or lock the
workstation if they will be away from the computer for any length of time.
Passwords
Passwords are a critical line of defense: anyone who knows a username and the
correct password can log on. Treat passwords carefully. Do not write a password
down; choose one that is easy to remember but hard to guess.
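As a hedged illustration of such a policy (the minimum length and the digit requirement here are assumptions for the example, not requirements stated in this study), a small shell check might look like:

```shell
#!/bin/sh
# Sketch: a minimal, illustrative password-strength check.
# Policy assumed for the example: at least 8 characters, with a digit.
check_password() {
  pw="$1"
  if [ "${#pw}" -lt 8 ]; then
    echo "too short"
    return 0
  fi
  case "$pw" in
    *[0-9]*) ;;                 # contains a digit: fall through
    *) echo "needs a digit"; return 0 ;;
  esac
  echo "ok"
}

check_password "secret"        # → too short
check_password "longpassword"  # → needs a digit
check_password "blue42horses"  # → ok
```

On a real Linux server, password aging can additionally be enforced with `chage` (for example, `chage -M 90 username` to force a change every 90 days).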
Regular backups are a must to protect your data from hardware failures,
accidents, viruses, and other malicious tampering. Since files must be read to be
backed up, and must be written to be restored, backup and restore rights should be
limited to administrators. Also, you’ll want to assign accountability for the proper
handling and storage of backup media.
When you establish an audit policy, you’ll need to weigh the overhead
in disk space and CPU usage of the auditing options against the advantages
those options provide. For medium security, you’ll want to at least audit failed logon
attempts.
Maximum Security
Maximum-security precautions are required for computers that contain highly
sensitive data. Start by examining your physical network links: check where the
lines come into your office or building. You may also want to control who has
physical access to the computer.
As soon as you put a computer on the network, you add a route into
your system, and this access port must be secured. Maintenance of user-account
validation and object permissions is sufficient for medium-level security, but for
maximum security you’ll need to make sure the network itself is secure.
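On a Linux server, one common way to secure that access route is a default-deny packet filter. The following is a hedged configuration sketch in iptables-restore format; the open ports, and the assumption that the server offers only SSH and Apache, are illustrative:

```
# Illustrative packet-filter rules (iptables-restore format),
# e.g. saved as /etc/iptables/rules.v4
*filter
:INPUT DROP [0:0]
:FORWARD DROP [0:0]
:OUTPUT ACCEPT [0:0]
# Always allow loopback and already-established connections.
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# Open only the services this server actually offers (assumed: SSH, HTTP).
-A INPUT -p tcp --dport 22 -j ACCEPT
-A INPUT -p tcp --dport 80 -j ACCEPT
COMMIT
```

Dropping everything by default and opening ports one by one means a forgotten service is unreachable rather than exposed.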
The two risks to network connections are unauthorized network users and
unauthorized taps on the wire. By keeping the network cabling inside the
building, you prevent, or at least minimize, the chance of unauthorized taps. If the
cabling must pass through unsecured areas, use optical fiber links rather than
twisted pair, to foil attempts to tap the wire and collect transmitted data. Use data
encryption where traffic must cross links you cannot physically secure. No computer
will ever be completely secure if an unauthorized user can physically access it. For
maximum security on a computer that is not physically secure (locked safely away),
consider the following security measures:
The CPU should have a case that cannot be opened without a key, and the key
should be stored in a safe place away from the computer. Ideally, the case should
also conceal the power and reset switches. The most secure computers (other than
those in locked and guarded rooms) expose only the computer’s keyboard, monitor,
mouse, and (when appropriate) printer to users. The CPU and the removable media
drives should be kept out of reach so that the machine cannot be booted into an
operating system other than Windows NT. Power-on passwords are a function of
the computer hardware, not the operating system, so check with your hardware
vendor for details; further restrictions can be applied through the Registry Editor.
The following topics also need to be addressed for a maximum-security installation.
The most recognized (at least in the U.S.) baseline measurement for a
secure operating system is the U.S. Department of Defense (DoD) criteria for a
C2-level secure system. C2 security is a requirement for many U.S. government
installations.
Scripting Languages
The name derives from a written script, such as a screenplay, where dialogue is
repeated verbatim for every performance.
PHP
PHP is a widely used scripting language for producing dynamic web pages; it is
used mainly in server-side application software.
PHP was created by Rasmus Lerdorf in 1994, initially as a simple set of Perl
scripts for tracking accesses to his résumé. Lerdorf created PHP to display his
résumé and to collect certain data, such as how much traffic his page was
receiving. “Personal Home Page Tools” was publicly released on June 8, 1995,
after Lerdorf combined it with his own Form Interpreter.
PHP generally runs on a web server, taking PHP code as its input and producing
web pages as output, often in combination with the MySQL database server. It can
be downloaded freely over the Internet. The conceptual model of the study is
shown below:
Figure 1. Conceptual Model (Input-Process-Output)
Input: Knowledge Requirements; Computer (clone/branded); Installers; Network Peripherals
Process: Design; Testing and Revision; Evaluation
Output: The proposed system
The proposed system is intended for system administrators. It replaces and
simplifies the task of manually injecting scripts by providing a preconfigured
script that runs on the server to optimize, protect, and secure it against attacks
and instability. The system administrator specifies the tasks to be automated,
and they run at scheduled times. A regular report is e-mailed to the administrator
or to other people permitted to receive it. Since everything is logged on the
server, an auto-filtering facility extracts a daily, weekly, or monthly report,
depending on how the user configures the script. Running the scripts requires
only basic server-maintenance skills. Once everything is in place, only updates
need to be run, and these can be scheduled using a cron job. Unauthorized access
attempts are recorded by the system, which applies the appropriate functions to
correct the problem automatically. Errors and intrusions identified from the logs
are then processed to arrive at the design and correct development of the system,
yielding an open source centralized server that requires little manual maintenance.
Figure 1 illustrates the conceptual model of the proposed project. The inputs
include knowledge of server services, basic networking and firewalling, the
Internet, and the Linux operating system core.
The developer will use PHP, as well as Perl and GCC, to develop and execute
code. PHP is a scripting language that can be freely downloaded over the Internet
and can run on virtually any platform. The system will also use Apache as the web
server, with Internet Explorer and Firefox as web browsers.