NETWORK ANALYZER AND NETWORK MONITORING TOOL USING SNMP

ABSTRACT
          This project is organized in five chapters. Chapter one introduces the proposed project and describes the problem statement, the aim and objectives, and the expected contribution to knowledge.
          Chapter two presents a detailed literature review of the concept of a network analyzer and of related works.
          Chapter three discusses the design methodology of the project.
          Chapter four deals with the implementation and results, and chapter five covers the summary, conclusion and recommendations of this research work.


TABLE OF CONTENTS
Title page
Certification
Dedication
Acknowledgement
Abstract
Table of contents
List of figures
List of tables
CHAPTER ONE
1.0     Introduction
1.1     Background knowledge of project
1.2     Benefit of research
1.3     Scope of study
1.4     Aims and objectives
1.5     Definition of terms
CHAPTER TWO
2.0     Literature Review
2.1     Historical background of research
2.2     Review of related works
2.3     Use case diagram
2.4     Architecture of the proposed system
2.5     The features of Psniffer
2.6     Principle of packet sniffing

CHAPTER THREE
3.0     System design
3.1     Research methodology
3.2     Analyzer Wrapper and Core Parser
3.3     Database server
3.4     Using the web interface
3.5     Dataflow diagram (DFDs)
CHAPTER FOUR
4.0     Implementation of the proposed system
4.1     Software requirement
4.2     Hardware requirement
4.3     Additional configurations
4.4     Implementation of the research
CHAPTER FIVE                                              
5.0     Summary
5.1     Conclusion and Recommendation
5.2     Future enhancement/recommendation
          References

CHAPTER ONE
1.0             INTRODUCTION
The Simple Network Management Protocol (SNMP) is a standard application layer protocol (defined by RFC 1157) that allows a management station (the software that collects SNMP information) to poll agents running on network devices for specific pieces of information. What the agents report is dependent on the device. For example, if the agent is running on a server, it might report the server’s processor utilization and memory usage. If the agent is running on a router, it could report statistics such as interface utilization, priority queue levels, congestion notifications, environmental factors (i.e. fans are running, heat is acceptable), and interface status.
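
As an illustration of this polling process, the short Python sketch below uses the pysnmp library (assumed to be installed) to issue an SNMP GET for the standard sysDescr object; the agent address 192.0.2.1 and the community string 'public' are placeholder assumptions, not values taken from this project.

# Sketch: a management station polling an agent for one MIB object with pysnmp.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

errorIndication, errorStatus, errorIndex, varBinds = next(
    getCmd(SnmpEngine(),
           CommunityData('public', mpModel=1),          # SNMPv2c community string (placeholder)
           UdpTransportTarget(('192.0.2.1', 161)),      # agent address and port (placeholder)
           ContextData(),
           ObjectType(ObjectIdentity('SNMPv2-MIB', 'sysDescr', 0))))

if errorIndication:
    print(errorIndication)                               # e.g. request timed out
elif errorStatus:
    print('%s at %s' % (errorStatus.prettyPrint(), errorIndex))
else:
    for name, value in varBinds:
        print('%s = %s' % (name.prettyPrint(), value.prettyPrint()))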

All SNMP-compliant devices include a specific text file called a Management Information Base (MIB). A MIB is a collection of hierarchically organized information that defines what specific data can be collected from that particular device. SNMP is the protocol used to access the information on the device the MIB describes. MIB compilers convert these text-based MIB modules into a format usable by SNMP management stations. With this information, the SNMP management station queries the device using different commands to obtain device-specific information.
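
To show how a management station uses MIB object names once the relevant MIB module is available, the hedged sketch below walks the ifDescr column of the standard IF-MIB (the interface description table) with pysnmp's nextCmd; the agent address and community string are again placeholders.

# Sketch: walking IF-MIB::ifDescr to list the interfaces an agent reports.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, nextCmd)

for errorIndication, errorStatus, errorIndex, varBinds in nextCmd(
        SnmpEngine(),
        CommunityData('public', mpModel=1),
        UdpTransportTarget(('192.0.2.1', 161)),
        ContextData(),
        ObjectType(ObjectIdentity('IF-MIB', 'ifDescr')),
        lexicographicMode=False):                        # stop at the end of the ifDescr column
    if errorIndication or errorStatus:
        break                                            # stop on timeout or agent error
    for name, value in varBinds:
        print('%s = %s' % (name.prettyPrint(), value.prettyPrint()))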

Network sniffing is a network-layer attack that consists of capturing packets transmitted by other computers on the network and reading the data content in search of sensitive information such as passwords, session tokens and other confidential information. This can be done using tools called network sniffers; these tools collect packets on the network and, depending on the quality of the tool, analyze the collected data with features such as protocol decoders or stream reassembly.

A packet in computer communications can be defined as a quantity of data of limited size. On the Internet, all traffic travels in the form of packets: file downloads, Web page retrievals and email all occur as packets. More formally, a packet is a formatted unit of data carried by a packet-switched computer network.
This work develops a network analyzer that can capture network traffic, analyze it, and allow the user to extract only the features needed and store them in a file for later use. Because only the required features are kept, the memory used to store the data is reduced, and the tool itself requires little memory for installation; many available tools, by contrast, can only capture network traffic without analysis, while some require a large memory size for installation. In addition, the tool has a user-friendly control interface (Awodele & Otusile, 2012).
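
As a minimal sketch of this capture-and-store idea (not the project's actual implementation), the Python fragment below uses the scapy library, which is assumed to be installed and run with administrator privileges, to capture a small number of packets and save them to a pcap file for later analysis.

# Sketch: capture a few packets and keep them in a file for later analysis.
# Assumes scapy is installed and the script runs with sufficient privileges.
from scapy.all import sniff, wrpcap

packets = sniff(count=50)          # capture 50 packets from the default interface
wrpcap('capture.pcap', packets)    # store them in a pcap file for later use

for pkt in packets:
    print(pkt.summary())           # one-line summary of each captured packet
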
A packet analyzer, sometimes called a network analyzer, protocol analyzer, sniffer, Ethernet sniffer or wireless sniffer (Spangler, 2003), is a computer program or a piece of computer hardware that can intercept and log traffic passing over part of a network (Chan, 2002). A sniffer is used as an aid to network management because of its monitoring and analysis features, which can help to troubleshoot a network, detect intrusion, control traffic or supervise network contents.
            Network packets contain a lot of useful information about network activity that can be used as a description of the general network behavior. Network packet analyzers have therefore become a useful tool for system and network administrators to capture such network information. In this report, an implementation of a network packet analyzer based on tcpdump, a popular network packet sniffer, will be described. This fully configurable tool concentrates particularly on its flexible input and output options so that it can easily be incorporated into other network tools to perform more complicated tasks, such as real-time or offline network intrusion detection. Database support is introduced in this tool as an output option because of the well-known efficiency and convenience of databases in handling huge amounts of information. A web front-end and the core packet analyzer can be integrated to become a database-backed web application, as discussed in this report.
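
One way the capture stage and the database output described above could fit together is sketched below in Python; this is an illustrative assumption of a possible layout (the interface name eth0, the SQLite database and the single-table schema are placeholders), not the exact design presented later in this report.

# Sketch: run tcpdump, read its one-line packet summaries, and store them
# in a database. tcpdump must be installed and runnable by the current user.
import sqlite3
import subprocess

conn = sqlite3.connect('packets.db')
conn.execute('CREATE TABLE IF NOT EXISTS packets (ts TEXT, summary TEXT)')

proc = subprocess.Popen(
    ['tcpdump', '-l', '-n', '-i', 'eth0', '-c', '100'],   # -l: line-buffered output
    stdout=subprocess.PIPE, text=True)

for line in proc.stdout:
    ts, _, rest = line.partition(' ')                      # timestamp, then the summary text
    conn.execute('INSERT INTO packets VALUES (?, ?)', (ts, rest.strip()))

conn.commit()
conn.close()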

1.1             Background knowledge of project
Network security covers more than what people usually think it to be: malware, viruses, Trojans and hackers. Network security can be undermined by unintentional human error, and it can be compromised by human nature as well. A common network security problem most organizations face has to do with the company's employees and the various errors they make. According to Dr. Michael E. Whitman, CISM, CISSP, author of the textbook "Principles of Information Security": "Humans make mistakes; sometimes that is due to inexperience or improper training, and sometimes it is because an incorrect assumption was reached. But regardless of the reason, and the lack of malicious intent, something as simple as a keyboarding error has the potential to cause a worldwide Internet outage" (Whitman and Mattord, 2012). The problem of piracy is another common network problem. Piracy is a situation where intellectual property is compromised, although there are technical mechanisms that aid in enforcing copyright laws to tackle this problem.
However, it is not only human error that can cause problems for network security; problems can also be caused by natural forces such as fire outbreaks, earthquakes, floods and lightning.
The way network administrators think about securing networks has been changed by an increasingly dynamic and technically challenging risk environment.
New business models rely on open networks with multiple access points to conduct business in real time, driving down costs and improving response to revenue-generating opportunities by leveraging the ability to quickly exchange critical information, share business files or folders and improve their competitive position.

1.2       BENEFIT OF RESEARCH
With increasing reliance on computer systems worldwide for data processing and information storage, the need for legitimate security of information and data cannot be overemphasized. Unauthorized access, disclosure or destruction of data can violate individual privacy and even threaten the existence of an organization. Since information is regarded as the live wire of an organization, it is therefore necessary to secure computer systems and the stored information.

1.3       SCOPE OF THE STUDY
The purpose of this project is to provide guidelines for organizations on planning and conducting technical information security testing and assessments, analyzing findings, and developing mitigation strategies. It provides practical recommendations for designing, implementing, and maintaining the processes and procedures for technical security testing and assessment, which can be used for several purposes, such as finding vulnerabilities in a system or network and verifying compliance with a policy or other requirements. This guide is not intended to present a comprehensive network information security testing or assessment program, but rather an overview of the key elements of technical security testing and assessment, with emphasis on specific software-based techniques, their benefits and limitations, and recommendations for their use.

1.4       AIMS AND OBJECTIVES
Since the evolution of attacks is endless, this project gives an overview of best practices for mitigating known attacks and recommendations on how to prevent recurring attacks.
            The objectives of this work are to reveal and define the concept of attacks and threats to a computer network, to highlight different mitigating techniques used to circumvent threats and attacks, to illustrate the procedure for implementing the best security practices, and to extend the practices of an outsider trying to gain access into the network to the network engineer.

1.5       Definition of terms
There are common and uncommon network terms that will be used often in this project. Below are some terms and their definitions.
1) Computer Network: a collection of computers that work together in order to allow sharing of resources and information.
2) Vulnerability: a weakness that compromises either the security or the functionality of a system.
3) Exploit: the mechanism used to leverage a vulnerability.
Examples include:
i. Password Guessing: various methods used to discover computer passwords.
ii. Shell Scripts: text files that contain a sequence of commands for a UNIX-based operating system.
iii. Executable Codes: programs, such as those written in interpreted languages, that require additional software to actually execute.
iv) Authentication: the process of confirming the true identity of a given user
v) Authorization: the process of permitting a user to access certain resources or perform certain activities.
vi) Firewall: a set of related programs that protects the resources of a private network.
vii) Antivirus: software that detects most viruses and many Trojans.
viii) Backdoor: a tool installed on a compromised system in order to allow an attacker to bypass any security mechanisms that are in place.
ix) Countermeasures: reactive methods used to prevent an exploit from successfully occurring once a threat has been detected.
x) Demilitarized Zone (DMZ): a network area (a subnetwork) configured with a firewall to secure the local area network (LAN).
xi) Packet analyzer/Sniffer: a computer program or a piece of computer hardware that can intercept and log traffic passing over a digital network or part of a network.
xii) Network traffic: the data moving across a network; this data is encapsulated in network packets.
xiii) Packets: network packets are units of data travelling across a network.
xiv) Packet capture: the process of intercepting and logging network traffic.
xv) SNMP: a protocol governing network management and the monitoring of network devices and their functions.
xvi) MIB (Management Information Base): a formal description of a set of network objects that can be managed using SNMP.
xvii) Network Analyzer: a specialized hardware device, or software on a desktop or laptop computer, that captures packets transmitted in a network for routine inspection and problem detection.
xviii) Compilers: A compiler is a special program that processes statements written in a particular programming language and turns them into machine language or code that a computer processor uses.
xix)Network Security: Is a specialized field in computer networking that involves securing a computer network infrastructure.
xx) Packet: A block of data transmitted across a network.
xxi) Protocols: A set of rules governing the exchange or transmission of data between devices.

xxii) DFD (Data Flow Diagram): It is a graphical representation of the “flow” of data through an information system, modeling its process aspects.
