Sunday, April 25, 2010

WIRELESS APPLICATION PROTOCOL
INTRODUCTION
The primary means of communicating information today are voice and the Internet. Unlimited access to the Internet, and the sheer number of people connected to it, have made industry captains realize its potential. The industry now plans its marketing and communication strategies around the Internet. Today everything from banking to education, research to health care, is affected by it. E-mail is the way to communicate today; practically everyone who uses the Internet uses e-mail. The wireless technologies and the Internet were initially growing separately, and the wireless industry struggled with a number of issues, such as low bandwidth and low connection stability, in bringing the Internet to its users. The industry's players came together to form a common forum to tackle these issues: the WAP Forum, which defined the Wireless Application Protocol (WAP).

The Wireless Application Protocol is a standard developed by the WAP Forum, a group founded by Nokia, Ericsson, Phone.com (formerly Unwired Planet), and Motorola. The WAP Forum’s membership roster now includes computer industry heavyweights such as Microsoft, Oracle, IBM, and Intel along with several hundred other companies. According to the WAP Forum, the goals of WAP are to be:
• Independent of wireless network standard.
• Open to all.
• Proposed to the appropriate standards bodies.
• Scalable across transport options.
• Scalable across device types.
• Extensible over time to new networks and transports.

WAP defines a communications protocol as well as an application environment. In essence, it is a standardized technology for cross-platform, distributed computing. This sounds similar to the World Wide Web, and in fact WAP is very similar to the combination of HTML and HTTP, except that it adds one very important feature: optimization for low-bandwidth, low-memory, and low-display-capability environments. These environments include PDAs, wireless phones, pagers, and virtually any other communications device.
The Wireless Application Protocol (WAP) is the result of continuous work to define an industry-wide specification for developing applications that operate over wireless communication networks. The wireless market is growing very quickly, reaching new customers and offering new services. To enable operators and manufacturers to meet the challenges of advanced services, differentiation, and fast, flexible service creation, WAP defines a set of protocols in the transport, session, and application layers.


WIRELESS INTERNET TELEPHONY

INTRODUCTION

Internet telephony can be defined as real-time voice or multimedia communication over the Internet or, more generally, over a packet-switched network. Two telecommunication standardization bodies now use SIP (Session Initiation Protocol) as the basis for making wireless Internet telephony a reality in the context of third-generation wireless telecommunication networks. Millions of highly mobile end-users who frequently change their location will access Internet telephony services using a wide range of wireless devices. An advanced service is anything that goes beyond a two-party call; it may or may not be related to telephony, e.g. call diversion, call transfer, or blended telephony. In wireless telephony, end-users need to be able to access these services from any terminal and from anywhere; this requirement is called universal access.

This paper presents the design, the implementation, and the evaluation of an advanced service architecture for wireless Internet telephony.
The architecture relies on mobile agents that act as folders and carry services. Carrying services in mobile agents raises the issue of agent upgrading when an end-user subscribes or unsubscribes to a service. A mobile agent is a software program whose main characteristic is its ability to move from node to node on a network during execution. Mobile agents meet the universal access requirement in an efficient manner: they can relocate to new devices being used by the end-user, or to the SIP proxy or H.323 gateway that is closest to the end-user. The mobile agent we use is called an MSA (mobile service agent); the other important elements are the Service Creation Unit (SCU), the Service Management Unit (SMU), and the Service Publication Unit. The SCU handles the creation of new services. The SMU manages user subscriptions and creates and maintains MSAs. The Service Publication Unit is the interface of the system to the external world and can be a simple web server. Our architecture tackles the upgrading issue by proposing and evaluating two novel schemes: agent swapping and on-the-fly updating. Although wireless Internet telephony is our prime target, the architecture is, to a large extent, independent of the underlying network and therefore applicable to Internet telephony in general.


WAVELET TRANSFORMS

ABSTRACT

Practically all signals are non-stationary and run into the problem of providing good time and frequency resolution simultaneously, due to Heisenberg's uncertainty principle. This paper focuses on Wavelet Transform techniques, which are only a decade old and provide better resolution and time-frequency representation of signals where the older Fourier Transform and the revised Short-Time Fourier Transform failed. The paper discusses the drawbacks of the FT and STFT, how Wavelet Transforms overcome them, and the vast field of their applications.
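As a concrete illustration of the idea, one level of the simplest wavelet transform, the Haar transform, can be sketched in a few lines of Python (an illustrative sketch, not the paper's own implementation):

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Splits an even-length signal into approximation (low-pass)
    and detail (high-pass) coefficients.
    """
    s = 1 / math.sqrt(2)
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) * s for a, b in pairs]
    detail = [(a - b) * s for a, b in pairs]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar DWT level: reconstructs the signal exactly."""
    s = 1 / math.sqrt(2)
    signal = []
    for a, d in zip(approx, detail):
        signal.append((a + d) * s)
        signal.append((a - d) * s)
    return signal
```

Recursing on the approximation coefficients gives the multi-level decomposition; the detail coefficients localize abrupt changes in time, which is exactly where the FT and STFT struggle.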


VOICE OVER INTERNET PROTOCOL

INTRODUCTION

VoIP (voice over IP - that is, voice delivered using the Internet Protocol) is a term used in IP telephony for a set of facilities for managing the delivery of voice information using the Internet Protocol (IP). In general, this means sending voice information in digital form in discrete packets rather than over the traditional circuit-committed protocols of the public switched telephone network (PSTN). A major advantage of VoIP and Internet telephony is that it avoids the tolls charged by ordinary telephone service.

VoIP is therefore telephony using a packet based network instead of the PSTN (circuit switched).

During the early 90's the Internet was beginning its commercial spread. The Internet Protocol (IP), part of the TCP/IP suite (developed by the U.S. Department of Defense to link dissimilar computers across many kinds of data networks) seemed to have the necessary qualities to become the successor of the PSTN.

The first VoIP application was introduced in 1995: an "Internet Phone" developed by an Israeli company named VocalTec. The application was designed to run on a basic PC. The idea was to compress the voice signal and translate it into IP packets for transmission over the Internet. This first-generation VoIP application suffered from delays (due to congestion), disconnections, low quality (due both to lost and to out-of-order packets), and incompatibility.
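The packetization idea described above can be sketched as follows. This is a toy model with hypothetical function names: it ignores codecs, RTP headers, and jitter buffers, and simply shows how a voice stream is split into sequence-numbered packets and rebuilt, in order, at the receiver:

```python
def packetize(samples, payload_size):
    """Split a stream of audio samples into sequence-numbered packets."""
    return [(seq, samples[i:i + payload_size])
            for seq, i in enumerate(range(0, len(samples), payload_size))]

def reassemble(packets):
    """Reorder packets by sequence number and rebuild the stream.

    A real receiver uses a jitter buffer and conceals lost packets;
    this sketch assumes every packet eventually arrives.
    """
    stream = []
    for seq, payload in sorted(packets):
        stream.extend(payload)
    return stream
```

The sequence numbers are what let the receiver repair the out-of-order delivery that plagued the first-generation applications; lost packets, however, still need concealment or retransmission strategies.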

VocalTec's Internet phone was a significant breakthrough, although the application's many problems prevented it from becoming a popular product. Since this step IP telephony has developed rapidly. The most significant development is gateways that act as an interface between IP and PSTN networks.


VIRTUAL REALITY

ABSTRACT:

The term 'Virtual Reality' (VR) was initially coined by Jaron Lanier, founder of VPL Research (1989). Other related terms include 'Artificial Reality' (Myron Krueger, 1970s), 'Cyberspace' (William Gibson, 1984), and, more recently, 'Virtual Worlds' and 'Virtual Environments' (1990s).

The relationship between our actions and their perceivable results is ruled by what we call the laws of nature. It is a general understanding that our actions act upon real objects, which react according to the laws of nature, and the reaction can then be perceived. Virtual Reality Facilities (VRFs) simulate this action-perception relationship in a physically correct manner, but without involving real objects or real events. Mathematical models of nature (physical theories) do just the same. So it stands to reason that VRFs can be considered analog models of nature.
If a physical theory is false, its predictions cannot be verified. If a VRF were false, we would have strange and unusual perceptions, as if different laws of nature were valid. One is tempted to say that we would fail to survive in nature when using a false mathematical model, just as with a false analog model. So an analog model of nature can be useful even if it is not 'true'.

Virtual Reality is an enabling technology that has wide applications in training, product design, etc. Virtual reality (VR) technology is being used to resolve problems in real-world situations. The National Aeronautics and Space Administration (NASA) is using VR to train astronauts to repair the Hubble Space Telescope.

In this talk, we present a brief introduction into Virtual Reality as a human centered interface technology.


MATHEMATICAL ANALYSIS OF HYDROPNEUMATIC SUSPENSION SYSTEMS

Abstract:

Everyone has seen heavy trucks running on the roads. These vehicles carry loads on their axles very close to the allowed limits, especially when driving on rough roads or during cornering. In such cases, conventional suspension systems such as McPherson struts, multi-link suspensions, trailing-arm suspensions, and four-bar suspensions can aggravate the axle-overload phenomenon. A hydropneumatic suspension system, when used in these vehicles, has the advantage of providing a better load distribution per axle, reducing the overload problem and thereby increasing ride comfort. The well-known problem of the damping coefficient changing with load variation in vehicles using conventional suspension is even more pronounced when a hydropneumatic spring is applied, due to its nonlinearity, as against the several advantages this spring type brings. The problem is most marked in vehicles with a large mass range, when they pass from a no-load to a full-load condition.

In this study, a mathematical model of the hydropneumatic spring stiffness behaviour was developed. The various factors and parameters that influence the spring stiffness behaviour have been determined mathematically. A methodology for the primary specification of the critical parameters of a hydropneumatic suspension system is also presented.
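A common starting point for such a model (a textbook polytropic gas-spring relation, not necessarily the exact model of this paper) treats the gas volume as the spring. Differentiating the piston force F = p·A under the polytropic law p·V^n = const gives a stiffness that grows as the gas is compressed, which is precisely the load-dependent nonlinearity discussed above:

```python
def gas_spring_stiffness(p, A, V, n=1.3):
    """Stiffness of a hydropneumatic (gas) spring under p * V**n = const.

    With piston force F = p*A and gas volume V = V0 - A*x,
    differentiating gives k = dF/dx = n * p * A**2 / V.

    p: gas pressure [Pa], A: piston area [m^2], V: gas volume [m^3],
    n: polytropic exponent (1 = isothermal, ~1.4 = adiabatic for N2).
    """
    return n * p * A**2 / V
```

Because compressing the spring raises p and lowers V, the stiffness rises steeply toward full load, while an unloaded vehicle rides on a much softer spring; a fixed damper coefficient cannot match both extremes, which is the mismatch the abstract points out.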


TRANSPARENT SWITCHES

INTRODUCTION

A large communication network can be pictured as having two main parts: a transmission plant and switching facilities. The first transports traffic between network nodes, while the second routes traffic over the transmission plant to get it from source to destination. In recent years, optical transmission technology has progressed very rapidly.

Transparent switches are switches in which optical signals are routed without intermediate conversion into electronic form.

These switches are also called photonic switches. None of them, however, is yet cheap and capable of dealing with the thousands of inputs and outputs that traditional electronic switches handle so well.

Several approaches are being explored for making these devices. These include arrays of tiny movable mirrors, known as microelectromechanical systems (MEMS), and units based on holographic crystals, liquid crystals, total internal reflection, and polarization-dependent materials. The problem is to figure out which all-optical switching technology to use in which application. Optical switches are sometimes referred to as O-O-O switches. Unlike O-E-O switches, present all-optical switches are not capable of separately routing each of the lower-rate data streams carried by a single input wavelength.


TIDAL POWER THE FUTURE WAVE OF POWER GENERATION

ABSTRACT

Renewable energy can be used to decrease global dependence on natural resources, and tidal power can be the primary form of renewable power utilized. Built upon steam-turbine knowledge, tidal turbines draw on innovative technology and design to operate on both the inflow and the outflow of water through them. Two case studies, Annapolis Royal and La Rance, prove that tidal power plants are capable of producing reliable and efficient power. Problems such as initial cost and power transportation hinder future implementation of tidal power plants. This paper emphasizes the possibilities of harnessing the power of the oceans through pollution-free tidal power generation. Tidal power utilizes the twice-daily variation in sea level caused primarily by the gravitational effect of the Moon and, to a lesser extent, the Sun on the world's oceans. The Earth's rotation is also a factor in the production of tides.
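The basic energy arithmetic behind a tidal barrage can be sketched as follows; the figures used are illustrative only, not the actual ratings of Annapolis Royal or La Rance:

```python
RHO_SEAWATER = 1025.0   # density of seawater [kg/m^3]
G = 9.81                # gravitational acceleration [m/s^2]

def barrage_energy_per_tide(basin_area_m2, tidal_range_m):
    """Potential energy released by emptying a barrage basin once.

    E = 0.5 * rho * g * A * h**2, because the centre of mass of the
    stored water column of height h sits at h/2.  Returns joules.
    """
    return 0.5 * RHO_SEAWATER * G * basin_area_m2 * tidal_range_m**2

def mean_power_watts(basin_area_m2, tidal_range_m, tides_per_day=2):
    """Average power if the basin cycles with every tide."""
    seconds_per_day = 86400.0
    energy = barrage_energy_per_tide(basin_area_m2, tidal_range_m)
    return tides_per_day * energy / seconds_per_day
```

The h-squared term explains why sites with an unusually large tidal range, like the Rance estuary, dominate: doubling the range quadruples the energy stored per tide.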


TELE-IMMERSION

ABSTRACT:

Tele-immersion is an advanced form of virtual reality that will allow users in different places to interact in real time in a shared simulated environment. This technology causes users to feel as if they were in the same room. The tele-immersion technology uses a "tele-cubicle" which is equipped with large screens, scanners, sensors, and cameras. The tele-cubicles are linked together in real time so that they form one larger cubicle. Through the virtual environment, participants are able to interact with other group members. Also, virtual objects and data can be passed through the walls between participants, and placed on the shared table in the middle for viewing. Tele-immersion has the potential to significantly impact educational, scientific, manufacturing, and many other fields.
• Interactive Scientific Visualization
• Molecular Engineering
• Virtual Nuclear Testing
• Education and Training
• Virtual Classrooms
• Army Training
• Art and Entertainment
• Virtual Games
• Industrial Design
• Architectural Review and Evaluation
• Remote Design Collaboration
All of these researchers use Internet2. Internet2 is the successor to the "commodity Internet", as the existing Internet is now known. It is a collaborative project, overseen by the University Corporation for Advanced Internet Development and worked on by 130 US universities and a number of government agencies and corporate sponsors.


Superworms and Cryptovirology: a Deadly Combination

Abstract

Understanding the possible extent of future attacks is the key to successfully protecting against them. Designers of protection mechanisms need to keep in mind the potential ferocity and sophistication of viruses that are just around the corner. That is why we think that the potential destructive capabilities of fast-spreading worms like the Warhol worm, the Flash worm, and Curious Yellow need to be explored to the maximum extent possible. Revisiting some virus techniques from the past, we come across some that utilize cryptographic tools in their malicious activity. That alarming property, combined with the speed of the so-called "super worms", is explored in the present work. Suggestions for countermeasures and future work are given.


STEGANOGRAPHY

ABSTRACT

Steganography (a rough Greek translation of the term is "secret writing") has been used in various forms for 2,500 years. Steganography is the art and science of hiding information by embedding messages within other, seemingly harmless messages. It has found use variously in military, diplomatic, personal, and intellectual-property applications. Briefly stated, steganography is the term applied to any number of processes that hide a message within an object, where the hidden message will not be apparent to an observer. This paper explores steganography from its earliest instances through potential future applications.

This paper introduces steganography by explaining what it is and providing a brief history, with illustrations of some methods for implementing it. Though the forms are many, the focus of the software evaluation in this paper is on the use of images in steganography. We also discuss the various secret communication methods in use and compare steganography with cryptography and digital watermarking. Finally, future projections on this technique are made, along with a few explanations using our own software and programs.
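The image-based approach the paper evaluates is commonly implemented as least-significant-bit (LSB) embedding. The sketch below (hypothetical helper names, not the paper's own software) hides a message in the low bits of a raw byte buffer such as uncompressed pixel data:

```python
def embed(cover: bytes, message: bytes) -> bytes:
    """Hide a message in the least significant bits of a cover buffer.

    Changing only the LSB alters each cover byte's value by at most 1,
    which is visually imperceptible in raw image data.
    """
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit
    return bytes(stego)

def extract(stego: bytes, message_len: int) -> bytes:
    """Recover message_len bytes from the LSBs of the stego buffer."""
    out = bytearray()
    for i in range(message_len):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (stego[i * 8 + bit_index] & 1)
        out.append(byte)
    return bytes(out)
```

Note the contrast with cryptography: the message here may be perfectly readable once found; the protection comes from an observer not suspecting it is there at all, which is why the two techniques are usually combined.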


SPYWARE

ABSTRACT

Millions of computer users are being watched, not just by employers and organizations, but by the software they frequently use, without their knowledge. Spyware has become a hub for collecting private data and threatens secure corporate data. It can even change computer settings, resulting in slow connection speeds, different home pages, and loss of Internet access or of other programs.

In an attempt to increase the understanding of spyware, we discuss the following questions: What exactly is spyware? How does it work? What is its impact on users and the businesses that employ them? How can it be prevented?


Speech Recognition

Abstract:

Language is man's most important means of communication and speech its primary medium. Speech research provides an international forum for communication among researchers in the disciplines that contribute to our understanding of the production, perception, processing, learning, and use of spoken language. Spoken interaction, both between human interlocutors and between humans and machines, is inescapably embedded in the laws and conditions of communication, which comprise the encoding and decoding of meaning as well as the mere transmission of messages over an acoustical channel. Here we deal with this interaction between man and machine through synthesis and recognition applications.
The paper dwells on speech technology and the conversion of speech into analog and digital waveforms that can be understood by machines.

Speech recognition, or speech-to-text, involves capturing and digitizing the sound waves, converting them to basic language units or phonemes, constructing words from phonemes, and contextually analyzing the words to ensure correct spelling for words that sound alike. Speech Recognition is the ability of a computer to recognize general, naturally flowing utterances from a wide variety of users. It recognizes the caller's answers to move along the flow of the call.

We have emphasized the modeling of speech units and grammar on the basis of the Hidden Markov Model. Speech recognition allows you to provide input to an application with your voice. The applications and limitations of this subject have enlightened us about the impact of speech processing in our modern technical field.
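The decoding step in HMM-based recognition is usually the Viterbi algorithm, which finds the most probable hidden state (e.g. phoneme) sequence for a series of acoustic observations. A minimal sketch with toy probabilities (hypothetical numbers, not a trained acoustic model):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence for an observation sequence
    under a discrete HMM (dynamic programming over path probabilities)."""
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = prob
            back[t][s] = prev
    # backtrack from the best final state
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 1 - 1, -1):
        if t == 0:
            break
        path.append(back[t][path[-1]])
    return list(reversed(path))

# toy two-state model standing in for two phonemes
STATES = ('S1', 'S2')
START = {'S1': 0.6, 'S2': 0.4}
TRANS = {'S1': {'S1': 0.7, 'S2': 0.3}, 'S2': {'S1': 0.4, 'S2': 0.6}}
EMIT = {'S1': {'a': 0.1, 'b': 0.4, 'c': 0.5},
        'S2': {'a': 0.6, 'b': 0.3, 'c': 0.1}}
```

A real recognizer runs the same recursion over phoneme-level HMMs whose emission probabilities come from acoustic feature vectors, with a language model constraining the transitions.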

While there is still much room for improvement, current speech recognition systems show remarkable performance. As we develop this technology and build on it, remarkable achievements follow. Rather than asking what is still deficient, we ask instead what should be done to make it efficient.



SOS TRANSMISSION
Through Cellular phones to save Accident Victims
-Boon for the cellular phone users
(MOBILE COMMUNICATION)
Abstract:
This paper describes an ORIGINAL IDEA to help cellular phone users caught in an accident. The idea has been developed keeping in mind the considerations of cost and compatibility with the existing system. The Short Message Service, or SMS as it is popularly referred to, is made use of for this purpose.

The solution offered is the Force-Transducer method. The victim is assumed to be unconscious and the accident is detected automatically. Detailed simulation results at a scaled down level are provided for this solution. The threshold level is set based on data collected from the experiments.

One major problem in such design is the technique to find the victim’s position. The Global Positioning System (GPS) is found to be costly. So, an unorthodox design using Radio Direction Finders (RDF) and beacon signals is described. The Goniometer or Crossed Loop Antenna is used for this purpose. This reduces cost effectively when compared with the GPS system.
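The RDF fix works by intersecting two lines of bearing taken from known stations. A minimal sketch of the geometry (angles measured from the +x axis for simplicity; a real goniometer reports compass bearings from north, which is just a change of convention):

```python
import math

def locate(station_a, bearing_a_deg, station_b, bearing_b_deg):
    """Fix a transmitter's position from two direction-finder bearings.

    Each station defines a ray from its position along its bearing;
    the fix is the intersection, found by solving a 2x2 linear system.
    """
    (xa, ya), (xb, yb) = station_a, station_b
    ca = math.cos(math.radians(bearing_a_deg))
    sa = math.sin(math.radians(bearing_a_deg))
    cb = math.cos(math.radians(bearing_b_deg))
    sb = math.sin(math.radians(bearing_b_deg))
    # xa + t*ca = xb + u*cb  and  ya + t*sa = yb + u*sb
    det = cb * sa - ca * sb
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel: no unique fix")
    t = (cb * (yb - ya) - sb * (xb - xa)) / det
    return (xa + t * ca, ya + t * sa)
```

Because the beacon's bearing is all that is needed, the victim's phone only has to transmit a signal; the expensive position computation stays on the network side, which is the cost advantage claimed over GPS.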

The paper proceeds to suggest an abstract view of the software robot required to perform the Save Our Souls (SOS) message routing task. It uses a special hierarchical message dispatch system wherein people nearby and more likely to help are contacted. The robot also acts as a proxy to the victim and monitors responses for him.

This paper as a whole gives a cost-effective, high performance system which can be introduced in the market if any of the cellular companies are willing to encourage it.


SOFT INSTRUMENTATION

Abstract -
A number of software packages are now available to aid in the generation of data acquisition and control applications. Just as spreadsheets are tools which help in the manipulation and presentation of data, so the software reviewed here are tools for generating applications. While the details of how the packages work vary from one to another, they do have two common features. Firstly, they have a graphical editor which allows one to generate the panels (screens) that the user will interact with as the application is progressing; secondly, they have an application editor for specifying what processing is to take place in an application and when it is to do so.


SOFT COMPUTING IN INTELLIGENT MULTI-MODAL SYSTEMS

ABSTRACT

In this paper we will describe an intelligent multi-modal interface for a large workforce management system called the smart work manager. The main characteristics of the smart work manager are that it can process speech, text, face images, gaze information and simulated gestures using the mouse as input modalities, and its output is in the form of speech, text or graphics. The main components of the system are a reasoner, a speech system, a vision system, an integration platform and an application interface. The overall architecture of the system will be described together with the integration platform and the components of the system which include a non-intrusive neural network based gaze tracking system. Fuzzy and probabilistic techniques have been used in the reasoner to establish temporal relationships and learn interaction sequences.


Single Event Errors

ABSTRACT
Satellites and other space equipment are constantly prone to dangers posed by solar radiation and cosmic rays. The effect of cosmic rays on semiconductor materials results in errors which sometimes lead to devastating consequences. These errors, classified into soft and hard errors, are to be kept in mind while designing electronic equipment to be used in space and in other precision applications. Single-event errors are such anomalies in computing systems, caused by cosmic rays emanating primarily from the sun. Single-event upsets were first detected by technicians manning electronic equipment after a nuclear test. Eliminating these errors involves testing the vulnerability of the equipment in the presence of different kinds of radiation, in different kinds of materials, and in different conditions.

The method of preventing these errors is twofold: one approach is called 'hardening' and the other 'logical'. Technologies like heavy-ion testing are being used to predict the behavior of electronic materials when exposed to ionizing radiation. In the case of 'hardening', the data obtained from these tests is used to determine the best materials to shield the electronic circuits from the radiation. In the case of logical techniques, parity bits and other redundancy checks are used to counter the effects of the radiation.
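Two classic logical mechanisms can be sketched concretely: an even-parity bit that detects a single flipped bit, and triple modular redundancy (TMR), which masks a single-event upset by majority-voting three copies of the data (an illustrative sketch, not flight software):

```python
def parity_bit(bits):
    """Even-parity bit for a word: makes the total count of 1s even.
    A single bit flip changes the parity and is therefore detected."""
    return sum(bits) % 2

def tmr_vote(a, b, c):
    """Triple modular redundancy: bitwise majority vote of three copies
    of a word masks an upset in any single copy."""
    return [1 if x + y + z >= 2 else 0 for x, y, z in zip(a, b, c)]
```

Parity detects but cannot correct; TMR corrects but triples the hardware. Real designs trade between these and intermediate schemes such as error-correcting codes, depending on how critical the circuit is.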

This paper is an effort by the authors to introduce the topic of single event errors and discuss the possible ways of avoiding and/or correcting them.


SATELLITE COMMUNICATION

ABSTRACT

The transfer of information from source to destination, i.e., from transmitter to receiver, is called communication. Communication is basically possible in two ways: wired communication and wireless communication.

Satellite communication is a good example of wireless communication.

In this paper we first give a brief history of satellites, then explain why satellites are used for communication, and describe the orbital model: how satellites stay in orbit, and the orbit types. We then look at how an artificial satellite is launched and what components satellite designers require. Finally, we present a few applications and key research challenges.


Bio-inspired Robotics

Introduction:

Our approach is characterized by a strong inclination toward biological inspiration, in which examples from nature, social insects in particular, are used as a way of designing strategies for controlling mobile robots. This approach has been successfully applied to the study of routing tasks, namely the ant algorithms used in computer networks for routing data between routers.
We study the phenomenon found in ants to derive the necessary behaviors for accomplishing this task, focusing on a species of ant known to possess this capability.

"Bio-computing" is a way "to understand how the relation of brain, body and environment produces behavior, to clarify the essential problems posed, and to devise and test hypotheses under realistic conditions". Social insects are capable of successfully navigating and acting in the face of uncertain and unpredictable environments. It was reasoned that if a single robot required complex systems and techniques in order to perform in a reliable manner, then perhaps intelligent systems could be designed with many "simpler" robots using a minimalist approach to sensing and actuation, where group behavior is an emergent property and control is decentralized. Could system reliability be achieved by trading complexity for redundancy, coupled with "randomness" used to explore possible solution paths, traits often found in social insect colonies? Maybe biology can teach us a thing or two about engineering swarms of simple interacting robots, and the theoretical foundations developed to model and explain the behaviors found in insect colonies can be used to underpin a more rigorous approach to collective robot design. Nature has already demonstrated the feasibility of this approach by way of the social insects.
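The ant-inspired routing idea rests on a simple pheromone loop: paths are chosen with probability proportional to their pheromone level, all trails evaporate over time, and the path an ant actually used receives a deposit inversely proportional to its cost, so shorter paths are reinforced. A minimal sketch with toy parameter values:

```python
import random

def choose_path(pheromone, rng=random.random):
    """Pick path i with probability proportional to its pheromone tau_i."""
    total = sum(pheromone)
    r, acc = rng() * total, 0.0
    for i, tau in enumerate(pheromone):
        acc += tau
        if r < acc:
            return i
    return len(pheromone) - 1

def update(pheromone, chosen, cost, rho=0.1, q=1.0):
    """Evaporate every trail by factor (1 - rho), then let the ant
    deposit q/cost on the path it used: cheaper paths get more."""
    return [tau * (1 - rho) + (q / cost if i == chosen else 0.0)
            for i, tau in enumerate(pheromone)]
```

Iterating these two steps over many ants makes the colony converge on low-cost routes while the evaporation term lets it forget stale ones, which is what makes the scheme attractive for adaptive network routing.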


Real-Time Reactive Control Layer Design for Intelligent Silver-Mate Robot on RTAI

Abstract:

Intelligent robots are capable of handling complex tasks that include recognition of vocal commands, logical inference, autonomous navigation and manipulation, etc. To accomplish intelligent behaviors, researchers have proposed a number of control software architectures such as the tripodal schematic control architecture (TSCA). To achieve real-time performance for the robot's navigation, we have implemented the software components of the reactive layer of TSCA on RTAI (Real-Time Application Interface) for our intelligent robot. In this article we present the structure of the reactive-layer components. The real-time performance of the designed reactive control layer is demonstrated via experimental results.


Real Time Task Scheduling

Introduction

A real time system is used when rigid time requirements have been placed on the operation of a processor or the flow of data; thus it is often used as a control device in a dedicated application.

The purpose of real-time computing is to execute its critical control tasks by the appropriate deadlines. The allocation/scheduling problem is: given a set of tasks, task precedence constraints, resource requirements, task characteristics, and deadlines, devise a feasible allocation/schedule on a given computer.

This is an ongoing topic of research, and the vast majority of assignment/scheduling problems on systems with more than two processors are NP-complete.
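For the single-processor case, by contrast, simple closed-form schedulability tests exist. Two classic ones (assuming independent periodic tasks with deadlines equal to periods) are the exact utilization test for earliest-deadline-first (EDF) and the Liu-Layland sufficient bound for rate-monotonic (RM) scheduling:

```python
def edf_feasible(tasks):
    """EDF schedulability on one processor for periodic tasks.

    tasks: iterable of (C, T) pairs, worst-case execution time and period.
    With deadlines equal to periods, EDF is feasible iff the
    utilisation sum(C/T) does not exceed 1.
    """
    return sum(c / t for c, t in tasks) <= 1.0

def rm_sufficient(tasks):
    """Liu-Layland sufficient test for rate-monotonic scheduling:
    U <= n * (2**(1/n) - 1).  Failing this bound does not prove
    infeasibility; it only means a more exact analysis is needed."""
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    return u <= n * (2 ** (1 / n) - 1)
```

For two tasks the RM bound is about 0.828, so a fully utilized task set can pass the EDF test while failing the RM bound, which is the classic illustration of EDF's higher utilization limit.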


Saturday, April 24, 2010

QUANTUM COMPUTERS

INTRODUCTION:

Behold your computer. Your computer represents the culmination of years of technological advancement, beginning with the early ideas of Charles Babbage (1791-1871) and the eventual creation of the first computer by German engineer Konrad Zuse in 1941. Surprisingly, however, the high-speed modern computer sitting in front of you is fundamentally no different from its gargantuan 30-ton ancestors, which were equipped with some 18,000 vacuum tubes and 500 miles of wiring! Although computers have become more compact and considerably faster in performing their task, the task remains the same: to manipulate and interpret an encoding of binary bits into a useful computational result. A bit is a fundamental unit of information, classically represented as a 0 or 1 in your digital computer. Each classical bit is physically realized through a macroscopic physical system, such as the magnetization on a hard disk or the charge on a capacitor. A document comprising n characters stored on the hard drive of a typical computer is accordingly described by a string of 8n zeros and ones. Herein lies a key difference between your classical computer and a quantum computer. Where a classical computer obeys the well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially quantum interference) to realize a fundamentally new mode of information processing.
A quantum computer is one which exploits quantum-mechanical interactions in order to function; this behavior, found in nature, possesses incredible potential to manipulate data in ways unattainable by machines today. The harnessing and organization of this power, however, poses no small difficulty to those who quest after it.
Subsequently, the concept of quantum computing, born in the early 1980s in the work of physicist Richard Feynman, has existed largely in the realm of theory. Miraculous algorithms, which would potentially take a billionth of the time required by classical computers to perform certain mathematical feats and are implementable only on quantum computers, have as such not yet been realized. A two-bit quantum system, recently developed by a coalition of researchers, constitutes the sole concrete manifestation of the idea.

In a quantum computer, the fundamental unit of information (called a quantum bit or qubit), is not binary but rather more quaternary in nature. This qubit property arises as a direct consequence of its adherence to the laws of quantum mechanics which differ radically from the laws of classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1 as in a classical bit, but also in states corresponding to a blend or superposition of these classical states. In other words, a qubit can exist as a zero, a one, or simultaneously as both 0 and 1, with a numerical coefficient representing the probability for each state.
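The superposition described above can be modeled directly: a qubit is a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. The sketch below is, of course, classical code simulating the mathematics, not a quantum computer:

```python
import math
import random

class Qubit:
    """A single qubit as two complex amplitudes (alpha, beta) with
    |alpha|^2 + |beta|^2 = 1.  |alpha|^2 is the probability of
    measuring 0, |beta|^2 the probability of measuring 1."""

    def __init__(self, alpha, beta):
        norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
        self.alpha, self.beta = alpha / norm, beta / norm

    def probabilities(self):
        return abs(self.alpha) ** 2, abs(self.beta) ** 2

    def measure(self, rng=random.random):
        """Collapse to a classical bit with the Born-rule probabilities."""
        p0, _ = self.probabilities()
        return 0 if rng() < p0 else 1

# equal superposition: a 50/50 outcome, unlike any classical bit
plus = Qubit(1, 1)
```

The exponential power of a real quantum register comes from the fact that n qubits hold 2^n amplitudes at once, so even this simulation becomes intractable classically as n grows.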


NETWORK SECURITY - QUANTUM CRYPTOGRAPHY

ABSTRACT

Why do we need network security? Because in networked systems the major security risks occur while conducting business on the Net. The following are some of these risks: unauthorized access, eavesdropping, password sniffing, spoofing, denial of service, virus attacks, system modification, data modification, repudiation, and e-mail bombing. One method of securing information is cryptography. Data transmitted over telecommunication lines is protected mainly through appropriate encryption techniques. Cryptography deals with encryption and decryption procedures. Encryption is the process of scrambling information so that it becomes unintelligible and can be unscrambled (reversed) only by using keys. Encryption is achieved using either symmetric or asymmetric encryption. In symmetric encryption (single-key cryptography), a single key is used both to encrypt and to decrypt. In asymmetric encryption (public-key cryptography), two keys, a public and a private key, are used for encryption and decryption. This paper presents network security through quantum cryptography. Quantum cryptography is a new method, efficient and the fastest of all, of securing information. Its main concepts are the quantum theory of light and polarization; the foundation of quantum cryptography lies in Heisenberg's uncertainty principle, which states that certain pairs of physical properties are related in such a way that measuring one property prevents the observer from simultaneously knowing the value of the other. Quantum cryptography is an effort to allow two users of a common communication channel to create a body of shared and secret information.
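The symmetric (single-key) case described above can be illustrated with a toy stream cipher: both parties derive the same keystream from the shared key and XOR it with the data. This is a teaching sketch only; real systems use vetted ciphers such as AES, not homemade constructions:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a shared key by chained
    hashing (illustrative only, not a vetted stream cipher)."""
    out, block = b"", key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Symmetric encryption: XOR the data with the keystream.
    XOR is its own inverse, so the same function also decrypts."""
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # same operation in both directions
```

The weak point of any such scheme is distributing the shared key; quantum key distribution attacks exactly that problem, because the uncertainty principle makes an eavesdropper on the key exchange detectable.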


PRESSURE AND LEVEL MEASUREMENT SYSTEMS

ABSTRACT

A sensor is a transducer that converts the measurand into a signal carrying information.

The present paper describes the construction of a fiber-optic microbend sensor for the measurement of pressure, and of a water-level sensor which is a pulley-and-counterweight version of the float-type level sensors. Optical fibers used for transmission of digital signals have a constant and low attenuation. However, attenuation in optical fibers still occurs. Losses in fiber-optic cables can be due to absorption, scattering, or excessive bending. Bending losses due to external forces can be used to sense changes in the measurand for a fiber-optic sensor. Microbend sensors are intensity-modulated fiber-optic sensors. In these sensors, the light emitted from an optical source is carried along a fiber, its intensity is modified at the transducer, and the light is returned to an optical detector.

Level measurement is an integral part of process control and may be used in a wide variety of industries. Float-type sensors are liquid-level sensors which can operate well in a wide variety of liquids. The float-and-pulley gauge provides an excellent method of measuring large changes in level with accuracy. It has the advantage that the scale can be placed for convenient reading at any point within a reasonable distance of the tank or vessel. This paper suggests a float-type sensor that gives the level change as a function of resistance change.


PLC Based sequential batch process control system

ABSTRACT

Programmable logic controllers (PLCs) are extensively used in industry for controlling the sequence of actions of a process. The sequence of the process flow is decided so as to control parameters like level and temperature. The brain of the system is the PLC. Appropriate hardware for interfacing the process to the controller is developed for controlling the level and temperature of the process. A ladder diagram is developed for controlling the sequence of actions.
A PLC is designed to operate in a real-time environment. It has been observed that distributed control systems and PLCs are extensively used in many industries. PLC-based sequential batch process control (PLCSBC) is a laboratory-type setup. This setup is useful for demonstrating the use of PLCs in sequential control operations in industry and the development of a ladder diagram for a particular application. The system under consideration is designed to carry out a sequence of events. The process under consideration is a batch process, which is controlled by the PLC.
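The sequential logic that such a ladder diagram encodes can be modeled as a small state machine evaluated once per PLC scan cycle: read inputs, update the state, set outputs. The fill/heat/drain sequence below is hypothetical; the setup's actual ladder diagram may differ:

```python
class BatchController:
    """Software model of a PLC batch sequence: FILL until the high-level
    switch trips, HEAT until the setpoint temperature is reached,
    then DRAIN until the empty switch trips, and repeat."""

    def __init__(self, temp_setpoint):
        self.state = "FILL"
        self.temp_setpoint = temp_setpoint

    def scan(self, level_high, level_empty, temperature):
        """One PLC scan cycle: read inputs, advance the sequence,
        and return the output coils as a dictionary."""
        if self.state == "FILL" and level_high:
            self.state = "HEAT"
        elif self.state == "HEAT" and temperature >= self.temp_setpoint:
            self.state = "DRAIN"
        elif self.state == "DRAIN" and level_empty:
            self.state = "FILL"
        return {"inlet_valve": self.state == "FILL",
                "heater": self.state == "HEAT",
                "drain_valve": self.state == "DRAIN"}
```

Each `elif` branch corresponds to one rung condition in the ladder diagram, and the returned dictionary plays the role of the output coils driving the valves and heater.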