
UNMANNED AERIAL VEHICLES IMAGERY FOR MONITORING INTRUDER IN A (DENSE TERRAIN) WAR ZONE

1 K. Suresh Kumar, Assistant Professor (ksureshmtech@gmail.com)
2 A. Vijayaraj, Assistant Professor (satturvijay@yahoo.com)

Department of Information Technology, Saveetha Engineering College
Saveetha Nagar, Thandalam, Chennai – 602105, Tamil Nadu, India

ABSTRACT

This paper presents the practicality of using embedded devices to autonomously fly a remote-controlled helicopter for defence applications. The goal is to maintain a stable hover using cheap embedded devices mounted on an inexpensive small helicopter. We discuss various design decisions and challenges concerning hardware, software, and image processing algorithms. The problem of unmanned flight proved more difficult than expected, but the work serves as a proof of concept that truly autonomous flight can be obtained using a mounted camera and embedded devices. Through the use of mounted sensors, the embedded device responds to the environment and corrects its flight in real time. We also describe the development of a suitable lightweight system in which an airborne sensor carries out surveillance and reports over GSM (mobile communication). The sensor should remain airborne for a minimum of 2 minutes at a height of 30 meters or more to image a proportionate area below, and recognizable real-time video should be transmitted to a ground receiver point suitably located in the observation area. The sensor should be able to detect man-sized objects under these conditions. The proposed solution covers the design of the configuration and the identification of suitable options for the sensor, data link, ground observation and control points, and other support systems; system configuration details comprising the sensor, data link, observation and data processing mechanism, and support systems form part of the design.

Key Words: Embedded Systems, Image Processing, Mobile Communication, F-Bus Protocol

1. INTRODUCTION

There is now considerable emphasis on applying information and communication technologies to solve day-to-day problems, and scientists and engineers are trying to make the work of governments easier. This paper describes a system that is suitable for many such applications.

Our system is an unmanned aerial vehicle that does the work of humans in military monitoring from the air, collecting and providing information necessary for defence. A small unmanned vehicle, lighter than manned aircraft and fixed-wing airplanes, flies over the area of interest and takes pictures along military borders.

The goal is to maintain a stable hover using cheap embedded devices mounted on an inexpensive small helicopter. We discuss various design decisions and challenges concerning hardware, software, and image processing algorithms.


2. EXISTING SYSTEM

The earliest UAV was A. M. Low's "Aerial Target" of 1916. A number of remote-controlled airplane advances followed during and after World War I, including the Hewitt-Sperry Automatic Airplane and the first scale RPV (Remotely Piloted Vehicle), developed by the film star and model airplane enthusiast Reginald Denny in 1935. With the maturing and miniaturization of applicable technologies in the 1980s and 1990s, interest in UAVs grew within the higher echelons of the US military. UAVs were seen to offer the possibility of cheaper, more capable fighting machines that could be used without risk to aircrews. Initial generations were primarily surveillance aircraft, but some were fitted with weaponry. Early UAVs used during the Vietnam War captured video after launch that was recorded to film or tape on the aircraft. These aircraft often were launched and flew either in a straight line or in preset circles, collecting video until they ran out of fuel and landed, after which the film was recovered for analysis. Because of the simple nature of these aircraft, they were often called drones. As more radio frequencies became available, UAVs were often remote controlled, and the term "remotely piloted vehicle" came into vogue.

3. PROPOSED SYSTEM

Real-time imagery and video have become an increasingly important source of information for remote surveillance, intelligence gathering, situational awareness, and decision-making. With airborne video surveillance, the ability to associate geospatial information with imagery intelligence allows decision makers to view the geographic context of a situation, track and visualize events as they unfold, and predict possible outcomes as the situation develops. Airborne video surveillance (AVS) technology is critical to achieving more ubiquitous and persistent surveillance.

Our current UAV system provides high-resolution target tracking and recognition capabilities. Recent technological advances, such as hardware miniaturization of sensors, electro-mechanical control, aerospace design, and wireless video communication, have enabled the physical realization of small and lightweight UAVs. We refer to this type of UAV as a micro-UAV (M-UAV). Compared to conventional UAV systems, M-UAVs have the unique advantages of low cost and rapid deployment. Their flexibility and ability to fly at low altitudes facilitate the collection of detailed video information for target identification, recognition, and tracking.

A great deal of progress has been made in the physical design of M-UAVs and in the development of vision technologies for autonomous flight control. However, the deployment and control of a group of autonomous M-UAVs for large-scale video surveillance, situational awareness, and target tracking remains very challenging. Tracking can be achieved with no moving parts. Typical ranges extend out to about 40 miles, depending upon transmitter power, antenna, and altitude of the aircraft. The new tracking system has been field tested with excellent results. The unit is self-contained with all electronics, powered by a single 12 V DC source, and can withstand rain and snow.

Figure: Automatic UAV tracking system


…road; mechanical connectors for connecting the cameras to the mobile platforms, which create an adjustable projection of the spatial distance between the camera and the mobile platform; and a control station for receiving and processing data from, and transferring data to, the cameras.

The camera offers images on demand and is an inexpensive alternative to satellite imagery or flying an airplane over a field. It can provide images for military borders, agriculture, forestry, oil and gas, surveys, mapping, land management, drainage, environmental monitoring, and a multitude of other uses.

It is highly efficient and user-friendly for the commercial market. It is a radio-controlled (RC) glider plane equipped with a digital camera, controlled by an autopilot together with pre-programmed ground control software. Available in an electric version, the platform will also work with an RC transmitter for manual control of the plane.

The platform uses an autopilot for navigation and control of the camera; everything else is standard RC parts (wings, servos, propellers, glow fuel or batteries) purchased locally. This makes the parts practical and accessible worldwide. Furthermore, the platform lends itself easily to the establishment of a dealer system due to its ease of use, the minimal training required, and the locally available RC parts. As outlined in the abstract, the aim is a lightweight system whose airborne sensor remains aloft for a minimum of 2 minutes at a height of 30 meters or more, images a proportionate area below, transmits recognizable real-time video to a suitably located ground receiver, and detects man-sized objects under those conditions; the design covers the sensor, data link, ground observation and control points, data processing mechanism, and other support systems.

3.1 Surveillance Missions

• Mission operator enters targets of interest (TOIs) and target characteristics
• May intercede during autonomous operations
• Autonomous system may ask for guidance

3.2 ARCHITECTURAL DESIGN

[Schematic: two PIC16F877A microcontrollers clocked at 20 MHz (22 pF load capacitors), a MAX232 RS232 level shifter (1 uF charge-pump capacitors) linking one PIC to the mobile phone, and a ULN2003 driver switching the flight relays (1N4007 flyback diodes) from +5 V and +12 V supplies.]

Figure 3.1: System architecture
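To make the serial side of this architecture concrete, the following C sketch initializes the PIC16F877A USART that feeds the MAX232 in the schematic. It is an illustrative sketch only: the register and bit names are Microchip's standard XC8 names, the 20 MHz clock follows the schematic, and the 9600 bps rate is an assumption (SPBRG = Fosc/(16*baud) - 1 = 129 at Fosc = 20 MHz with BRGH = 1).

/* Sketch of the serial link implied by the schematic: the PIC16F877A's
 * USART feeds the MAX232 level shifter toward the phone/PC. */
#include <xc.h>

void usart_init(void)
{
    TRISC |= 0x80;       /* RC7/RX as input                     */
    TRISC &= ~0x40;      /* RC6/TX as output                    */
    SPBRG  = 129;        /* 9600 bps at Fosc = 20 MHz, BRGH = 1 */
    TXSTAbits.BRGH = 1;  /* high-speed baud rate generator      */
    TXSTAbits.SYNC = 0;  /* asynchronous mode                   */
    RCSTAbits.SPEN = 1;  /* enable serial port pins RC6/RC7     */
    RCSTAbits.CREN = 1;  /* continuous receive                  */
    TXSTAbits.TXEN = 1;  /* enable transmitter                  */
}

void usart_putc(unsigned char c)
{
    while (!TXSTAbits.TRMT)  /* wait until the shift register is empty */
        ;
    TXREG = c;
}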

3.2.1 Module 1: Flight Control Wireless Unit


Figure: Module 1 block diagram

The object detected through the camera is given to the microcontroller, which sends it to an RF amplifier; the RF amplifier amplifies the signal, which is then transmitted through the RF transmitter. On the vehicle, the data received from the transmitter is given to the microcontroller, which drives the motor driver circuits; the two motors are controlled by the microcontroller.

The movement of the flight is determined by the PC control unit. When 's' is pressed on the PC, the signal is received through the RF receiver and the vehicle begins propelling, using its motor to start; batteries provide the power supply. Sending 'm' increases the speed to a medium level, and 'h' maximizes it. Left and right directions are likewise commanded by key presses from the PC unit. According to our program, when 6, 7, or 8 is pressed from the mobile unit, the system is asked to check for a human, a vehicle, or other objects respectively; this opens the corresponding ports, and the PC control unit then opens the camera window. These instructions are received by the RF receiver.

The flight unit sends the moving pictures frame by frame to the PC unit. If there is any variation between frames, the PC senses it and sends a command to save that particular frame; the flight control unit saves the frame and sends it to the PC. When 'a' is sent from the PC, the relays are switched off and the vehicle lands.

3.2.2 MODULE 2: PC Image Processing Unit

Figure: Module 2 block diagram (PIC16F877A processor, 2.4 GHz A/V receiver, USB interface, ADC, RF encoder circuit)

The PC receives data from the video receiver circuit at 2.4 GHz, and communication takes place through a USB 2.0 port. After receiving the video frames, it displays them through a media player and reads 35 frames per second. Each frame is compared using erosion and grayscale filter algorithms together with an edge-detection algorithm; after comparison, the system determines whether any human-based object has been detected, and if so it raises the alarm. The PC receives '1' from the mobile unit and converts this into a variable. It checks the serial communication, and if communication is working it sends an 'access' message to the mobile unit. The mobile then sends 6, 7, or 8 to the PIC for detecting a human, a vehicle, or any other object. If the PIC receives 6, it is converted into the variable 'w' and sent to the PC, which in turn sends it to the second PIC, where it is converted into the variable 'a'; this opens motion port 1 for human detection and also opens the application window and the camera window. Similarly, when the PIC receives 7 or 8, it is converted into the variable 'x' or 'y' respectively and sent to the PC for the detection of a vehicle or any other object. Pixel counts, height, and width are used to identify whether the object is a human, a vehicle, or something else.
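As a concrete illustration of this size-based identification, the C sketch below classifies a detected blob by its bounding-box dimensions. The thresholds and aspect-ratio rules are illustrative assumptions only; the paper does not give the actual pixel values, which depend on altitude and camera resolution.

#include <stdio.h>

/* Object classes reported back to the mobile unit (6 = human, 7 = vehicle,
 * 8 = other, per the protocol described in the text). */
typedef enum { OBJ_NONE, OBJ_HUMAN, OBJ_VEHICLE, OBJ_OTHER } object_class;

/* Classify a detected blob by its bounding-box size. Thresholds are
 * assumed values for illustration, not taken from the paper. */
static object_class classify_blob(int width_px, int height_px)
{
    if (width_px <= 0 || height_px <= 0)
        return OBJ_NONE;
    if (height_px >= 20 && height_px <= 80 && height_px > width_px)
        return OBJ_HUMAN;                    /* small, taller than wide */
    if (width_px > 80 && width_px > height_px)
        return OBJ_VEHICLE;                  /* large, wider than tall  */
    return OBJ_OTHER;
}

int main(void)
{
    printf("25x60  -> %d\n", classify_blob(25, 60));   /* OBJ_HUMAN   */
    printf("140x70 -> %d\n", classify_blob(140, 70));  /* OBJ_VEHICLE */
    return 0;
}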

To operate the system, we type 's' from the PC unit, which starts the flight at slow speed: the relay corresponding to slow speed opens and the vehicle is propelled into the air. Pressing 'm' increases the speed of the vehicle; the relay corresponding to medium speed opens and all other relays are switched off. When the speed is to be increased to its maximum, press 'h': the relay for high speed opens, closing all the others. For direction, 'l' or 'r' is pressed for left or right respectively, which opens the corresponding relay while the high-speed relay remains open. When the work is completed, press 'a' to stop the motion of the vehicle; this lands the vehicle by switching off all the relays.
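The single-character protocol above maps naturally onto a small dispatch routine on the microcontroller. The following C sketch illustrates that mapping; the relay bit assignments, and the representation of the relay bank as one byte, are assumptions for illustration, since the pin mapping is not specified.

#include <stdio.h>

/* Assumed relay bit assignments; the paper does not give the pin mapping. */
#define RELAY_SLOW   0x01
#define RELAY_MED    0x02
#define RELAY_HIGH   0x04
#define RELAY_LEFT   0x08
#define RELAY_RIGHT  0x10

/* Map one received command byte to a relay pattern. Speed relays are
 * mutually exclusive; direction relays stay combined with the high-speed
 * relay, matching the behaviour described in the text. */
static unsigned char relay_pattern(char cmd, unsigned char current)
{
    switch (cmd) {
    case 's': return RELAY_SLOW;                /* start at slow speed  */
    case 'm': return RELAY_MED;                 /* medium speed         */
    case 'h': return RELAY_HIGH;                /* maximum speed        */
    case 'l': return RELAY_HIGH | RELAY_LEFT;   /* turn left            */
    case 'r': return RELAY_HIGH | RELAY_RIGHT;  /* turn right           */
    case 'a': return 0;                         /* all relays off: land */
    default:  return current;                   /* ignore unknown bytes */
    }
}

int main(void)
{
    unsigned char relays = 0;
    const char *session = "smhra";              /* example command stream */
    for (const char *p = session; *p; ++p) {
        relays = relay_pattern(*p, relays);
        printf("cmd '%c' -> relays 0x%02X\n", *p, relays);
    }
    return 0;
}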

4. MOTION DETECTION

There are many approaches to motion detection in a continuous video stream. All of them are based on comparing the current video frame with one of the previous frames, or with something we will call the background. This application supports the following types of video sources:

• AVI files (using Video for Windows; an interop library is included);
• updating JPEG from internet cameras;
• MJPEG (motion JPEG) streams from different internet cameras.

Algorithms

One of the most common approaches is to compare the current frame with the previous one. This is useful in video compression, where you need to estimate changes and write only the changes rather than the whole frame. But if the object is moving smoothly, we receive only small changes from frame to frame, so it is impossible to extract the whole moving object. Things become worse when the object is moving so slowly that the algorithm gives no result at all.

There is another approach: compare the current frame not with the previous one but with the first frame in the video sequence. If there were no objects in the initial frame, comparing the current frame with the first one yields the whole moving object independently of its motion speed.
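A minimal C sketch of this differencing idea follows; the per-pixel threshold and frame size are illustrative assumptions. Passing the previous frame as the reference gives the first approach, while passing the initial empty frame gives the second.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define THRESH 25   /* assumed per-pixel difference threshold (0..255) */

/* Count pixels whose grayscale value differs from the reference frame by
 * more than THRESH; a large count indicates motion or a new object. */
static int diff_count(const unsigned char *frame,
                      const unsigned char *reference,
                      int width, int height)
{
    int changed = 0;
    for (int i = 0; i < width * height; ++i)
        if (abs(frame[i] - reference[i]) > THRESH)
            ++changed;
    return changed;
}

int main(void)
{
    unsigned char ref[64 * 64], cur[64 * 64];
    memset(ref, 100, sizeof ref);          /* empty background frame    */
    memcpy(cur, ref, sizeof cur);
    memset(cur + 1000, 200, 50);           /* simulate an intruder blob */
    printf("changed pixels: %d\n", diff_count(cur, ref, 64, 64)); /* 50 */
    return 0;
}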

5. EDGE DETECTION

Two families of methods are considered: (1) edge detection using the Sobel method, and (2) edge detection using the Laplacian method.

Edges characterize boundaries and are therefore of fundamental importance in image processing. Edges in images are areas with strong intensity contrasts: a jump in intensity from one pixel to the next. Edge detection significantly reduces the amount of data and filters out useless information while preserving the important structural properties of an image.

Edge detection methods may be grouped into two categories, gradient and Laplacian. The gradient method detects the edges by looking for the maximum and minimum in the first derivative of the image. The Laplacian method searches for zero crossings in the second derivative of the image to find edges.

Suppose we have a one-dimensional signal containing an edge, that is, a jump in intensity.


The first derivative of such a signal shows a maximum located at the center of the edge. An alternative for finding the location of an edge is to locate the zeros in the second derivative; this method is known as the Laplacian.
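Since the original plots did not survive, a small worked numeric example makes the same point. Take a one-dimensional signal with an edge and form its first and second differences:

f   =   0    0    0   50  100  100  100   (signal with an edge)
f'  =   0    0   50   50    0    0        (first difference: maximum across the edge)
f'' =   0   50    0  -50    0             (second difference: zero crossing at the edge centre)

The first difference peaks across the edge, and the second difference changes sign there, which is exactly what the gradient and Laplacian methods look for.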

5.1 SOBEL

The Sobel edge detector uses a pair of 3x3 convolution masks, one estimating the gradient in the x-direction (columns) and the other estimating the gradient in the y-direction (rows). A convolution mask is usually much smaller than the actual image, so the mask is slid over the image, manipulating a square of pixels at a time. The actual Sobel masks are shown below:

Gx:            Gy:
-1   0  +1     +1  +2  +1
-2   0  +2      0   0   0
-1   0  +1     -1  -2  -1

Table: The Sobel convolution masks Gx and Gy

The magnitude of the gradient is then calculated using the formula |G| = sqrt(Gx^2 + Gy^2). An approximate magnitude can be calculated more cheaply using |G| = |Gx| + |Gy|.
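The C sketch below applies these masks at one pixel of a grayscale image and uses the approximate magnitude |Gx| + |Gy|. It is a minimal illustration rather than the paper's implementation; a row-major, one-byte-per-pixel image layout is assumed, and border pixels are left to the caller.

#include <stdio.h>
#include <stdlib.h>

/* Apply the Sobel operator at pixel (x, y) of a grayscale image;
 * the caller must keep (x, y) at least one pixel away from the border. */
static int sobel_at(const unsigned char *img, int width, int x, int y)
{
    static const int KX[3][3] = { {-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1} };
    static const int KY[3][3] = { { 1, 2, 1}, { 0, 0, 0}, {-1,-2,-1} };
    int gx = 0, gy = 0;
    for (int j = -1; j <= 1; ++j)
        for (int i = -1; i <= 1; ++i) {
            int p = img[(y + j) * width + (x + i)];
            gx += KX[j + 1][i + 1] * p;
            gy += KY[j + 1][i + 1] * p;
        }
    return abs(gx) + abs(gy);   /* approximate |G| = |Gx| + |Gy| */
}

int main(void)
{
    /* 5x5 test image with a vertical edge between columns 1 and 2 */
    unsigned char img[25];
    for (int y = 0; y < 5; ++y)
        for (int x = 0; x < 5; ++x)
            img[y * 5 + x] = (x < 2) ? 0 : 255;
    printf("|G| at (2,2) = %d\n", sobel_at(img, 5, 2, 2)); /* strong edge */
    return 0;
}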

6. GSM (MOBILE COMMUNICATION BY F-BUS PROTOCOL)

This system reports detected human-based objects to a mobile phone by SMS. Most Nokia phones have F-Bus and M-Bus connections that can be used to connect the phone to a PC, or in our case to a microcontroller. The connection can be used for controlling just about all functions of the phone, as well as uploading new firmware, and this bus allows us to send and receive SMS messages.

Figure: Mobile phone pin structure

M-Bus is a one-pin bidirectional bus for both transmitting and receiving data from the phone. It is slow (9,600 bps) and only half-duplex, and uses only two pins on the phone: one ground and one data. F-Bus is the later high-speed full-duplex bus: it uses one pin for transmitting data and one pin for receiving data, plus the ground pin, much like a standard serial port. It runs at 115,200 bps with 8 data bits, no parity, and one stop bit. For F-Bus, the data terminal ready (DTR) pin must be set and the request to send (RTS) pin cleared.


Sample frame sent to a Nokia 3310 (shown as a hex dump):

Byte: 00 01 02 03 04 05 06 07 08 09 10 11 12 13 14 15
Data: 1E 00 0C D1 00 07 00 01 00 03 00 01 60 00 72 D5
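The two trailing bytes of this dump (72 and D5) are checksums: XOR-ing the even-indexed bytes of the frame yields 0x72, and XOR-ing the odd-indexed bytes yields 0xD5. The short C program below verifies this on the sample frame; reading the trailing bytes as even/odd XOR checksums is our interpretation of the dump, consistent with published F-Bus notes.

#include <stdio.h>

int main(void)
{
    /* Frame payload from the hex dump above, without the two checksums. */
    const unsigned char frame[] = {
        0x1E, 0x00, 0x0C, 0xD1, 0x00, 0x07,
        0x00, 0x01, 0x00, 0x03, 0x00, 0x01, 0x60, 0x00
    };
    unsigned char even = 0, odd = 0;
    for (int i = 0; i < (int)(sizeof frame); ++i) {
        if (i % 2 == 0) even ^= frame[i];   /* XOR of even-indexed bytes */
        else            odd  ^= frame[i];   /* XOR of odd-indexed bytes  */
    }
    printf("checksums: %02X %02X\n", even, odd);  /* prints: 72 D5 */
    return 0;
}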

Working of F-Bus to send a message:

• Let's say we want to encode the string 'hello'. First we display 'hello' in hexadecimal using the character map given in GSM 03.38; for A to Z and numbers it is just the standard ASCII conversion.

  h e l l o (ASCII characters)
  68 65 6C 6C 6F (in hexadecimal)
  1101000 1100101 1101100 1101100 1101111 (as 7-bit binary)

• Shown below is the same string 'hello', just displayed backwards:

  6F 6C 6C 65 68
  1101111 1101100 1101100 1100101 1101000 (the characters in binary)
  110 11111101 10011011 00110010 11101000 (the same bits split into 8-bit segments)
  06 FD 9B 32 E8 (the 8-bit segments written in hex)

• The message 'hello' is therefore E8 32 9B FD 06 when packed.

The first program allows the user to input the string to be packed in the top memo box; when the 'Pack' button is pressed, the top string is packed.

The second program both unpacks and then packs the message. It was used to read packed messages from a serial protocol analyzer, and it also tests C subroutines for both packing and unpacking.
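A minimal sketch of such a pack subroutine in C is given below, reproducing the worked 'hello' example. It assumes, as the text notes, that the characters involved map to GSM 03.38 the same way as ASCII; it is an illustration, not the paper's actual code.

#include <stdio.h>
#include <string.h>

/* Pack 7-bit characters into octets per GSM 03.38;
 * returns the number of output bytes written to out. */
static int pack7(const char *in, unsigned char *out)
{
    int len = (int)strlen(in), n = 0;
    for (int i = 0; i < len; ++i) {
        int shift = i & 7;
        if (shift == 7)
            continue;               /* every 8th septet is fully absorbed */
        unsigned char cur  = (unsigned char)in[i] & 0x7F;
        unsigned char next = (i + 1 < len) ? (unsigned char)in[i + 1] & 0x7F : 0;
        out[n++] = (unsigned char)((cur >> shift) | (next << (7 - shift)));
    }
    return n;
}

int main(void)
{
    unsigned char buf[160];
    int n = pack7("hello", buf);
    for (int i = 0; i < n; ++i)
        printf("%02X ", buf[i]);    /* prints: E8 32 9B FD 06 */
    printf("\n");
    return 0;
}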

In our paper, the mobile number and the SMS centre number are fed over F-Bus using the C program. To check whether serial communication exists, '1' is sent from the mobile; if it does, the mobile receives an 'access' message. The number 6, 7, or 8 is then sent from the mobile to the PIC for the detection of a human, a vehicle, or any other object respectively. If any of these is detected, the mobile is notified by the PC.

7. CONCLUSION

Thus the paper maintains a stable hover using cheap embedded devices mounted on an inexpensive small helicopter. System components include noise reduction, feature extraction, classification, and decision-making, with decisions delivered as an alarm signal. The system is suitable for aerial videography, thermal imagery, target designation, sensor placement, or precision munitions delivery, and can perform these functions at a small fraction of the usual cost.

8. FUTURE ENHANCEMENT

Future developers must strive to provide enhanced-capability systems at affordable prices. Technological improvements in UAV performance are already underway. Newer UAVs under development should have more autonomous control so that they need less pilot correction; this will include automatic collision avoidance. Improved mission control capabilities should allow multiple UAVs to fly in cooperative groups and formations. Improved coordination of UAV flights with the flights of manned aerial vehicles, satellites, cruise missiles, and other UAVs would further enhance their utility.


Increasing UAV fuel efficiency or fuel capacity will allow unmanned aerial vehicles to fly farther, conduct more complex missions, and loiter longer; it would also allow commanders to require fewer UAVs. Weaponized UAVs of the future might employ not only heavier and more advanced precision-guided munitions but also directed-energy weapons such as destructive laser beams.

Making UAVs faster would decrease their vulnerability to enemy fire and increase their ability to confirm the hostility of potential targets in time for manned aircraft strikes. Increasing UAV stealth and the variability of UAV flight paths would also make them harder for an enemy to shoot down. Giving them more all-weather capability through more effective deicing equipment would also improve their utility in winter.

Increasing the quality of UAV sensors through miniaturization would reduce the need for U-2 and satellite reconnaissance, and real-time video imagery could be enhanced. To meet the needs in these areas, UAV remote sensors should aim to be all-in-one, commercial off-the-shelf (COTS), integrated systems. UAV remote sensor manufacturers should no longer aim to produce "stand-alone" sensors, but rather "integrated sensor" systems that link high-resolution visual, EO, and radar imagery together.


Author Profile:

Suresh Kumar is an Assistant Professor in the Department of Information Technology at Saveetha Engineering College. He received his Master of Computer Applications from Madurai Kamaraj University in 2003 and his Master of Technology from Sathyabama University in 2007. He has recently begun research on mobile databases. His areas of interest include Web Technology, Database Technology, and Mobile Computing.
