CV on video stream

This short demonstration/video shows how to use one of my simple, yet powerful, computer vision scripts to detect a color; in this case, the video comes from a wireless camera.

Connection diagram

The diagram above shows how everything is connected.

The QX10 can send a stream of JPEGs to a device connected to the wireless network the camera creates when it's turned on.

To access this stream you need to read and use the API supplied by Sony, and it's not always very clear… But I managed (after a few attempts) to read the stream using Python and OpenCV, and then, with a quick modification to one of my scripts, detect color on it. You can find the code here.
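The essence of what ended up working for me is treating the liveview as a stream of concatenated JPEGs and cutting each frame out between its start (0xFFD8) and end (0xFFD9) markers. A minimal sketch of that idea, assuming the liveview URL is already known (the real one is returned by the camera's startLiveview API call, so the one below is just a placeholder):

[code lang="python"]
import cv2
import numpy as np
import requests

# Placeholder URL: the real one is returned by the camera's startLiveview call.
LIVEVIEW_URL = "http://10.0.0.1:60152/liveview.JPG"

stream = requests.get(LIVEVIEW_URL, stream=True)
buf = b""

for chunk in stream.iter_content(chunk_size=1024):
    buf += chunk
    start = buf.find(b"\xff\xd8")  # JPEG start-of-image marker
    end = buf.find(b"\xff\xd9")    # JPEG end-of-image marker
    if start != -1 and end != -1 and end > start:
        jpg, buf = buf[start:end + 2], buf[end + 2:]
        frame = cv2.imdecode(np.frombuffer(jpg, np.uint8), cv2.IMREAD_COLOR)
        if frame is not None:
            cv2.imshow("QX10 liveview", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
[/code]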

I made a short video demonstrating the script. The target is to find the color green and draw a box around it.
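The detection itself is the usual HSV threshold-and-contour routine; a minimal sketch of that part (the helper name and the HSV bounds for "green" are mine to illustrate, and the bounds need tuning for your lighting):

[code lang="python"]
import cv2
import numpy as np

def find_green(frame):
    """Draw a box around the largest green blob in a BGR frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Rough HSV bounds for green; tune for your camera and lighting.
    mask = cv2.inRange(hsv, np.array([40, 70, 70]), np.array([80, 255, 255]))
    # Two-value return as in OpenCV 2.x/4.x
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return frame
[/code]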

The most important part (for me, anyhow) is to check the times of the transmission and the detection of the color; this is the latency… If you pause the video when I'm showing the iPhone on both screens (one the real one, the other the transmitted image), you can see that the latency of the stream is around 0.3-0.5 seconds, 0.4 seconds on average. That's not bad, actually.

Check the video, and leave your comments.

 

How to DroneKit SITL

Software in the loop is a software testing technique that helps you test your software in simulation before trying it on real, physical systems. ASAM defines a generic simulator interface as:

Definition of an API between test automation systems and test-benches such as HIL-systems (hardware-in-the-loop) or SIL-systems (software-in-the-loop). Provides access to the simulation model, ECU internal measurement and calibration data, diagnostics data, the electrical error simulation unit and the ECU network. API is described as a technology-independent UML model.

In the case of our beloved drones, we will use SIL techniques to crash them virtually before attempting to test our software on real expensive hardware.

Thanks to the developer team at 3DR, we have a full simulator that allows you to run APM Plane, Copter or Rover without any hardware. It is a build of the autopilot code using an ordinary C++ compiler, giving you a native executable that lets you test the behaviour of the code without hardware.

The main issue is that sometimes it gets tricky to make it work… or to make it work properly…

This small guide is for people who are having problems making it work (and also for me, when installing it on new systems :P)

Assumptions:

  • This guide is for Mac users (maybe I’ll make a post for Windows…)
  • If you have an Android tablet, it will work as well (as a complement)
  • You know basic stuff on Mac (like using the terminal, brew, pip)
  • Brew, pip and Python are installed

Having said that, let’s start the process…

 

Installation.

 

We basically need three things: dronekit, dronekit-sitl and mavproxy. MAVProxy is a command-line UAV ground station software package for MAVLink-based systems. MAVLink is the “language” that our beloved drones speak.

Steps:

  • Update brew:

brew update

  • Update your pip:

pip install --upgrade pip

  • Install Dronekit:

pip install dronekit

  • Install SITL:

pip install dronekit-sitl

  • Install MAVProxy (execute line by line):

brew tap homebrew/science

brew install wxmac wxpython opencv

pip uninstall python-dateutil

pip install numpy pyparsing

pip install MAVProxy

Usage.

 

The simplest use of this set of tools is to send commands to a simulated vehicle to take off and land… and to watch on screen how it is doing.

To achieve this, we need to create a virtual vehicle and then connect MAVProxy to it, and make MAVProxy spread the information to several devices/applications.

Create a vehicle (on one terminal):

 

dronekit-sitl copter --home=55.870595,-4.287639,0,0

Allow me to explain what I’m doing in this command:

  • copter -> SITL can do plane, copter, rover… I’m making this virtual vehicle a multicopter.
  • --home=55.870595,-4.287639,0,0 -> Setting the home of the vehicle outside the building where I work.

 

Start MAVProxy (on a different terminal):

 

mavproxy.py --master=tcp:127.0.0.1:5760 --out=udpout:192.168.1.9:14550 --out=udpout:127.0.0.1:14550 --out=udpout:127.0.0.1:14549

Allow me to explain what I’m doing in this long command:

  • --master=tcp:127.0.0.1:5760 -> Here we declare how to connect this MAVProxy instance to the virtual vehicle (dronekit-sitl creates a TCP connection on a specific port and waits until something connects… in our case, MAVProxy will connect to it).
  • --out=udpout:192.168.1.9:14550 -> I’m making MAVProxy forward all of the information it receives from the virtual vehicle to a particular IP using UDP… this IP happens to be the one of my tablet. So, as long as my tablet is connected to the ALTAX network, it will receive the information from the virtual vehicle.
  • --out=udpout:127.0.0.1:14550 -> Same as above but using localhost, so… MAVProxy will forward the information to my Mac over UDP. It sounds kind of weird, but this is for using another application like “APM Planner” or “qgroundcontrol” and seeing the same information the tablet sees.

 

Open applications:

 

Now, what’s left to do is open an application that will show us “graphically” how the vehicle is doing… just as if it were REAL!! So, go and open “qgroundcontrol” on the Mac…

And on the tablet, open “Tower” and change the telemetry connection type from “usb” to “udp”… that’s it!

qgroundcontrol on the Mac and Tower on the tablet

Take off!

 

OK, let’s make the vehicle move or do something fun… execute these lines (literally, as they are written) in the terminal window that is running MAVProxy:

  • mode guided
  • arm throttle
  • takeoff 1000

With these commands you will make the vehicle start flying until it reaches 1000 meters (mental!!!) and then wait, or “loiter”. Of course, the battery will start to run out, but no worries, it’s virtual!

qgroundcontrol – I took this screenshot when the vehicle was at almost 400 meters

 

Tower app reporting the vehicle at 1 km altitude.

 

If you want your vehicle to land, then just type: “mode land”
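The same sequence can also be scripted with DroneKit itself instead of typed into MAVProxy. A minimal sketch, connecting to one of the UDP outputs declared in the MAVProxy command above (using the spare 14549 endpoint for the script is my assumption about what it was reserved for):

[code lang="python"]
import time
from dronekit import connect, VehicleMode

# One of the --out endpoints declared when MAVProxy was started
vehicle = connect("udp:127.0.0.1:14549", wait_ready=True)

vehicle.mode = VehicleMode("GUIDED")
vehicle.armed = True
while not vehicle.armed:              # arming takes a moment
    time.sleep(1)

vehicle.simple_takeoff(20)            # target altitude in metres
while vehicle.location.global_relative_frame.alt < 19:
    time.sleep(1)

vehicle.mode = VehicleMode("LAND")
vehicle.close()
[/code]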

Congratulations, you just made a virtual vehicle take off, fly to 1000 meters and then hit the ground at 0.509052 m/s. Go have a beer, you earned it.

 

Flight Stack

I’m currently using two (or more…) computers onboard my drones: one computer is in charge of stabilisation and control of the vehicle (g.n.C), while the other handles the guidance and navigation part (G.N.c). In this post I’ll define and give a short explanation of my flight stack.

G.N.C. is a branch of engineering dealing with the design of systems to control the movement of vehicles, especially automobiles, ships, aircraft and spacecraft. At this moment my main focus is aircraft and, more particularly, multirotors. In many cases these functions can be performed by trained humans, pilots… however, because of the speed and complexity of multirotor flight dynamics, human reaction time is too slow to control this movement; it is effectively impossible for a human to directly command the four actuators that make a quadcopter fly. Therefore, systems are used for such control. Even in cases where humans can perform these functions, GNC systems often provide benefits such as alleviating operator workload, smoothing turbulence, energy savings, etc. In addition, sophisticated applications of GNC enable automatic or remote control.

TestQuad showing Optical Markers, GPS, Altax Flight Stack, RPI camera and LED red landing lights.

Flight Controllers

 

To handle the stabilisation and control of the vehicle I use off-the-shelf flight controllers, several of them: MultiWii (several models), APM, PX4, Pixhawk and CC3D, among others. The ones I work the hardest are the MultiWii boards and the Pixhawk.

MultiWii

 

Altax Flight Stack with a Naze32

You might be asking yourself which MultiWii board I use, and the answer is simple: whichever. I wrote a Python module that speaks the MultiWii serial protocol (pyMultiWii), and that’s how I communicate with the flight controller. So any flight controller board that runs the MultiWii firmware, or that is equipped with MSP (MultiWii Serial Protocol), will work with my module. I have tested my module on the following MultiWii boards:

  • MultiWii AIO v1 and v2
  • MultiWii lite v1
  • MultiWii SE v2
  • NanoWii
  • MultiWii 328p
  • Afro Naze32
  • Naze32 (complete)

The new trend among flight controllers is 32-bit microprocessors (usually at 72 MHz…), and the new MultiWiis are 32-bit versions with similar sensors to the old ones. Of course the firmware is different, and vehicle attitude control is much smoother than on the 8-bit boards, thanks to faster processing and better precision (float or double) in the calculations. The performance gain was very noticeable after switching to the new board version: the position controllers performed 70% better once the inner loop (attitude stabilisation) was faster.

And among those new flight controllers there are two “similar” firmwares: baseflight and cleanflight. I will not go into detail about that situation. Currently I’m using baseflight, although I have reports from other people that my module also works with cleanflight.

Pixhawk

 

Altax Flight Stack with a Pixhawk Lite

The Pixhawk is a flight controller designed at ETH Zurich; it started with the development of the PX4 autopilot and then evolved into its new form, known as Pixhawk. It supports several flight stacks; the most popular/most used is the APM flight stack from 3DRobotics, which is right now the second largest consumer drone company in the world. The other is the “PX4 flight stack”.

On the hardware side, it uses a 32-bit processor clocked at 168 MHz and two IMUs instead of one, to provide redundancy. I will not go too much into detail about this.

Being an open source, open hardware project, several “versions” of the Pixhawk have already appeared on the market, as well as several Chinese clones… I have tested several versions of the Pixhawk, among them:

  • PX4 (original)
  • 3DR Pixhawk
  • Pixhawk (HobbyKing clone)
  • Pixhawk Lite

Although the Chinese clones are far cheaper than an original version, I personally recommend using the original ones: they have been tested more and go through stricter quality control, and are therefore less likely to fail. And when you’re flying over the Atlantic… you want something that will NOT fail.

There are several alternatives for talking to this flight controller. Both the PX4 and APM flight stacks speak “mavlink”, a protocol for communicating with small unmanned vehicles. It is designed as a header-only message marshalling library. It is being adopted by several other flight controllers, and it was created by Lorenz Meier… you guessed correctly, from ETH Zurich.

As I was saying, there are two ways of talking to the FC. One is using the 3DR option called “DroneKit”; I have used it to make several examples, and it has its limitations and its practicalities.

The other one is to use ROS.

If I have enough time I’ll make a post describing both options.

Companion Computers

 

So, after introducing the flight controllers (the ones that are in charge of the “inner loop” or the stability of the aircraft), I’ll proceed to discuss a bit about the companion computers.

This depends entirely on the target mission you want to accomplish; in my case, I’m developing position controllers for multirotors… I should mention that we could actually use just the FC to develop position controllers… but it is far more complicated to modify tons of firmware code, on either the Nazes or the PX4/APM flight stack, than to separate the “outer loop” and run it on another computer. That’s where the companion computers come into action. Separating the computers also gives you a bit more robustness: if the companion computer crashes, the FC is still able to keep stabilising the vehicle and (with properly tuned failsafes) probably land it or disengage it.

There is a boom of Linux credit-card-sized computers nowadays that we can use as companion computers. Personally I like the Raspberry Pi, due to its simplicity, robustness and low price. And now, with the new Pi Zero, things get very, very small and light.

Companion computers I have used on my vehicles:

  • Raspberry Pi B+
  • Raspberry Pi 2
  • Odroid U3

I have had accidents with the RPis, and those small computers are really sturdy. The RPi 2 is great when you want to run computer vision algorithms and then command the vehicle accordingly. But an RPi B+ (if you have one lying around) works great too.

Right now I have two identical vehicles with Naze32s and RPi B+s doing position control, and they work quite well.

330 quad with Altax Flight Stack being controlled using motion capture data.

 

Flight Stack

 

So, what’s the big deal about having another computer on board? Well… you have more computing power, you can do more complex stuff, you can interconnect more devices to the vehicle, or you can even turn the vehicle into an IoT device.

In my simplest case I used it to get readings from the ground station, which can contain motion capture data, joystick commands or even calculated control commands.

Connection diagram

In the case of the Naze32 I simply connect a short USB cable between the RPi and the board. In the case of the Pixhawk there are two ways to connect it: using the RPi’s onboard serial interface, or using an FTDI cable/board to make them talk. I have tested almost all possible connections.

I designed and 3D printed a small base that is bolted to the RPi and holds either the Naze32 or a Pixhawk. It makes everything tidier and works well. I’ll upload it to Thingiverse and share it as well.

 

Wind Tunnel

I had the opportunity to perform some flights inside the University of Glasgow wind tunnel facilities, also called “The De Havilland Tunnel”, which has a test section of 9 ft × 7 ft (2.7 m × 2.1 m) and can reach a maximum speed of 70 m/s (156 mph).

HexRacer and airplane scale model inside the Wind Tunnel at the University of Glasgow

This video (if it’s good enough…) will be used as promotional material around the University of Glasgow.

I chose my new hexacopter racer for this work, and I just outfitted it with a video transmitter and a Mobius camera. My idea was to perform the flights via FPV, but due to the “tight” space between my vehicle, the tunnel and the installed model (a metallic scale model of an airplane), it was too risky to do it that way, so I had to get inside the tunnel and fly by eye. This also proved complicated, but in the end I managed to get some footage that the media team will analyse to see if they can insert it somewhere in the promotional videos…

Something to consider: if you fly in a tunnel close to the ground, you will suffer from ground effect, which causes instabilities… and too close to the ceiling something similar happens: the rotors have more trouble generating the lift to keep the vehicle level at the same altitude, which also causes some nasty behaviours.

ground effect
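For rough intuition on the ground-effect side, there is the classic Cheeseman-Bennett approximation from helicopter aerodynamics, which I’m assuming carries over per rotor to small multirotors:

T_IGE / T_OGE = 1 / (1 - (R / 4z)^2)

where R is the rotor radius and z the rotor height above the surface. The lower you hover, the more thrust you get for the same power, which is exactly what makes the vehicle float and wobble near the tunnel floor.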

The only thing that did not happen was a crash… I tried to be extra careful to avoid hitting the tunnel and, more importantly, the model.

I produced some great gifs:

Drone flyby airplane

And a great video:

Finding Nessie

A few months ago I travelled around the highlands with my trusty hexacopter and, like any normal person excited about Scotland, I tried to find the most famous cryptid in the world: Nessie.

A cryptid is an animal or plant whose existence has been suggested but has not been discovered or documented by the scientific community.

The Loch Ness Monster

Loch Ness is the second largest lake in Scotland; it’s also very deep, reaching 230 meters at its deepest point. I would definitely like to dive there, maybe on another trip…

Anyhow, I used my hexacopter to try to find Nessie. Needless to say, I was unsuccessful… or maybe not??? Watch the video and you might be surprised.

I flew with my 10,000 mAh battery, which gave me about 20 minutes of flight time. I performed automatic missions, but mainly it was manual flying. It was a bit risky because I was flying over water; any failure and I’d have had to swim to recover it… but not as risky as this one. The vehicle performed flawlessly.


 

Check the video!!!

Flying from computer

As a by-product of some research I’m working on right now, I’m showing these videos of me controlling a multirotor from a computer. It’s done on different platforms -two flight controllers- the Pixhawk and the MultiWii.

The companion computer used in these tests is a credit-card-sized computer, the popular Raspberry Pi 2, which has four cores clocked at 900 MHz and 1 GB of RAM.

Connection diagram

So, the basic idea is to send data to the companion computer, which will use it for several purposes. The purpose shown in this post is to send roll, pitch, throttle and yaw data to the flight controller; in other words, to fly the multicopter from another computer, eliminating the need for a standard RC radio. You might ask, why would I want to do this? There are three answers: one, because I can… two, to ensure that the communication with the vehicle is working well and is robust, and three, to further develop advanced outer-loop position controllers.

This is by no means new or cutting edge; it’s just clever use of the libraries I have written (in the case of my MultiWii library…) and of the one developed by the people at 3DR (DroneKit).

The companion computer is running Linux and some Python scripts (I’ll explain more later…); the ground station runs Matlab/Simulink to read the joystick and the motion capture tracking system (not used in these examples…), and the data is then simply sent via UDP to the Raspberry Pi.

Test vehicles

We are currently testing on 3 vehicles, all of them quadrotors; one is a bit bigger than the others… 650 mm from rotor to rotor, compared with 330 mm for the smaller ones.

Pixhawk

 

Firstly, I would like to make a recommendation related to this flight controller: don’t buy the HobbyKing clone version. We wanted to “save” money by buying the clone version and ended up with 2 controllers that do not work… we might just have had bad luck with a bad batch, but who knows… Buy the original one.

The Pixhawk is connected to the Raspberry Pi using serial communication (a TX/RX/GND cable); we are using serial port 2 on the Pixhawk.

The baud rate on the RPi is limited to 115,200… but for our requirements it still works fine. Whenever we need more speed, we will need to modify the kernel or change from Debian to Ubuntu…

The rest of the multirotor remains the same. The GPS is connected, but inside a building there is no way to get a GPS lock. No worries, though: we have a very precise indoor “GPS”… a motion capture system. In the pictures shown above you can see a GPS stand being used for the OptiTrack markers; the overall tidiness of the vehicle is truly amazing.

The firmware remains untouched; we are using 3.2 and the 3DR flight stack. We might change to the PX4 flight stack, but so far this one works fine. The only thing we have changed on the Pixhawk is the set of SR2 rate parameters; apparently the maximum value we can assign is 10 Hz… we might need more.

For this one we use mavproxy.py and droneapi (since renamed DroneKit) to send the channel overrides and fly the vehicle; the code looks something like this:

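(A minimal sketch in the spirit of the original, written against the later standalone DroneKit API rather than the droneapi-inside-MAVProxy module; the UDP endpoint and the helper name are my assumptions.)

[code lang="python"]
from dronekit import connect

# Assumed endpoint: one of the --out ports of the MAVProxy instance
# that is talking to the Pixhawk.
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)

def send_rc(roll, pitch, throttle, yaw):
    """Override RC channels 1-4 (values in the usual 1000-2000 us range)."""
    vehicle.channels.overrides = {"1": roll, "2": pitch,
                                  "3": throttle, "4": yaw}

send_rc(1500, 1500, 1200, 1500)  # level attitude, low throttle
[/code]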

DroneAPI makes writing apps/scripts, and particularly talking to the Pixhawk, a very easy task. I want to thank Andrew Tridgell and all the developers involved.

The resulting product is shown in this video:

MultiWii

 

I have done this in the past, in several forms and with different boards, but in this test I’m using a Naze32 board, which is great because it’s 32-bit, which means it’s faster and the stabilisation is smoother; and, of course, the communication is way, way faster than with my old 8-bit versions.

I have actually achieved 300 Hz of communication using an Odroid U3… check this video.

This library performs great, is lightweight, and is extremely easy to use.

The UDP reader thread was also written by me, using Twisted.
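A minimal sketch of such a reader (the port and the packet format are assumptions; here the payload is taken to be four little-endian floats for roll, pitch, throttle and yaw):

[code lang="python"]
import struct
from twisted.internet import reactor
from twisted.internet.protocol import DatagramProtocol

class CommandReader(DatagramProtocol):
    """Stores the latest command received from the ground station."""
    latest = (1500.0, 1500.0, 1000.0, 1500.0)

    def datagramReceived(self, data, addr):
        # Assumed payload: four little-endian floats
        if len(data) == 16:
            self.latest = struct.unpack("<4f", data)

reader = CommandReader()
reactor.listenUDP(51001, reader)  # port is an assumption
reactor.run()
[/code]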

The code for this one (the most important part) looks like this:
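(Again a minimal sketch rather than the exact original; the serial port and update rate are assumptions, and the channel order roll, pitch, yaw, throttle is what MSP_SET_RAW_RC expects.)

[code lang="python"]
import time
from pymultiwii import MultiWii  # adjust the import to match your copy of the library

board = MultiWii("/dev/ttyUSB0")  # serial port is an assumption

def send_rc(roll, pitch, yaw, throttle):
    """Push one MSP_SET_RAW_RC frame to the board."""
    board.sendCMD(8, MultiWii.SET_RAW_RC, [roll, pitch, yaw, throttle])

while True:
    send_rc(1500, 1500, 1500, 1100)  # level attitude, low throttle
    time.sleep(0.01)                 # ~100 Hz
[/code]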

Which code do you think is easier??

The video is next:

Dunnet Head Drone

This great Scottish summer, I did a small road-trip to the highlands with a few friends and my trusty hexacopter.

We made a stop at a place called Dunnet Head, a peninsula in Caithness on the north coast of Scotland that includes the most northerly point of mainland Great Britain. It has an awesome cliff which you can only see from a boat or with a drone 😉

It was close to 11 am when I started to get my vehicle ready. The weather was almost perfect: sunny, no rain, and just 17 kph of wind (the highest wind speed I had flown in before was 22 kph…). But of course… I was about to fly above the Atlantic Ocean!!! Any wind speed is undesirable.

17.1 kph wind speed

This mission is without doubt the riskiest I have ever done. The problem is that if something failed (a prop, the GPS, an ESC, whatever…) my vehicle was not just going to crash, it was going to be lost FOREVER. There is no way to get down that cliff (without equipment), and by the time I could reach it in a boat it would be 20-30 meters underwater. So, if anything failed, it was game over.

Then why do I do this, you must be thinking… Just because I like to test my vehicles to the maximum, and I trust them. Oh, by the way, I built this one; it’s not just a 1000 USD DJI Phantom thing… It’s a vehicle into which I have put hours and hours of designing, building and testing, and this was its ultimate test.

Hexacopter ready to fly (props folded)

The vehicle is almost entirely carbon fibre, and you can see the specifications here. An interesting fact about this one: it uses foldable 12 in props, while the usual size for this type of propeller is 15 in.

So, I had my crew (the gang) helping me with crowd control and warning me of dangers that I, as the pilot, could not easily see. Thanks Tania, Dalila and Rob, you guys rock!

I planned what I thought was going to be a small automatic mission, because it is easier for the vehicle to fly nicely than for me, 300 meters away with no FPV gear. The mission was approximately 900 meters in total length and was going to be done in 5 minutes.

Mission analysis

After doing the pre-flight tests and checking the connections and the vehicle overall, the take-off was very sweet and controlled. Loiter was holding great against the wind. I switched to stabilise and started to fly it myself outside of the safe area (away from the cliff’s land), to take care of possible big wind gusts at the edge of the cliff. Everything went great. I started doing some loitering to take nice videos and 360s, and tilted the gimbal to see the danger under the vehicle (don’t forget to check the video).

Going on automatic mode

The automatic mission started, and I was checking how it was performing through the telemetry link to my tablet. The first WPs went great, it started to turn towards Orkney and then… I heard my tablet say “Mode Land”!!!

Of course that mode takes time and descends very slowly, and the pilot can override it by switching modes. At that moment I knew the mission was over and the next thing to do was get it back safely. I switched to stabilise and, checking the heading on my tablet to make it point towards me, I gave full pitch forward… The vehicle was 363 meters away from me and I could barely see it; my tablet was super helpful. In the end I was able to bring it back safely, and I actually continued flying until the battery was almost dry.

The logs showed an EKF (Extended Kalman Filter) variance; the EKF is used as the primary source for attitude and position estimates. This check triggers when the EKF’s compass and velocity “variances” are higher than a specific value, and that is what happened. The vehicle was flying in a mode that requires GPS (Auto), so it switched to pilot-controlled LAND… not very cool in this kind of situation.

Actual GPS log from the entire flight
EKF variance
My friends after the successful flight

The vehicle performed excellently (except for that variance…) and I was able to get very nice footage to show you guys, my readers. So please check the next video and enjoy:

pyMultiWii

MultiWii is a UAV autopilot system developed by RC hobbyists. The project uses an Arduino board as the main processor, while the sensor system can vary, and it aims to make the fabrication of the electronics easy. In its early days it used the gyroscopes and accelerometers of the commercial off-the-shelf Wii motion controller from Nintendo, which needed less soldering. They define it as: “MultiWii is a general purpose software to control a multirotor RC model.”

I have used it for several years now, and it’s a great entry level to this exciting world of vehicles that crash beautifully. I also use it inside my laboratory.

In my lab I develop the outer-loop controller (a hover controller, for example) and the MultiWii acts as the inner-loop controller… it reads the sensors and generates the PWM signals, among other things, and more importantly it tracks the attitude angles I command.

Of course, to make the MW listen to my commands I need to speak “its language”, and that is the MultiWii Serial Protocol, also known as MSP.

I’m currently using a Raspberry Pi (or similar) as a companion computer onboard my vehicles; this computer calculates and processes the commands of the outer loop (e.g. a hover controller).

My companion computers therefore need to speak MSP… and that, precisely, is the purpose of this library.

I developed this library after spending too much time trying to use someone else’s code. I made it easy to use and easy to change. My definition is as follows:

pyMultiWii handles the MultiWii Serial Protocol to send/receive data from boards.

Let’s make a very easy example… Imagine that you need to get the vehicle’s attitude at 10 Hz; this is how to do it:

[code lang="python"]
import time
from pymultiwii import MultiWii  # adjust the import to match your copy of the library

serialPort = "/dev/tty.usbserial-A101CCVF"
board1 = MultiWii(serialPort)

while True:
    # Request the attitude (angles and heading) ten times per second
    print board1.getData(MultiWii.ATTITUDE)
    time.sleep(0.1)
[/code]

Easy, right??

You just need to create a MultiWii object pointing at the proper serial port, then create a while loop (to cycle forever) and send an attitude request to the board… and then “sleep” for 0.1 seconds in order to do it at 10 hertz.

Wait, it’s not limited to 10 Hz!!! You can use whatever rate you want… The fastest I have tried, with a Naze32 board and an Odroid U3 companion computer, has been something above 300 hertz!!!

I have successfully used my library for commanding vehicles in my lab; the MultiWii boards tested have been:

  • MultiWii AIO v2
  • Naze32 (either baseflight or cleanflight)

This library has been used by several people around the world to create their own cool UAV projects (I’ve received multiple emails thanking me for this piece of code)…

I encourage people to use it and let me know if they find mistakes or ways to make it better. It’s licensed as GPL and the only thing I ask when using it is that people cite me or the library.

Here are some nice videos of the library in action:

 

I sent stuff to space

A few months ago I was approached by some very nice co-PhDs doing one of those cool balloons that fly very, very high for some purpose, then burst and open a small parachute to try to soft-land the payload.

I was interested, of course, because it’s not very common to be able to say “I have sent some stuff to space”, and I definitely wanted the possibility of saying that.

The project was titled:

 

Investigating stratosphere micrometeorite contamination

Abstract:

Meteorites and micrometeorites are important “peek-holes” to the formation of our solar system and processes on other planetary bodies. The field of meteoritics has received a lot of attention due to controversies around signatures of “life” in Martian and other meteorites, and contamination is a big problem when studying meteorites. In this study, carried out as part of the global space balloon challenge (GSBC), a Kaymont 1200 weather balloon equipped with an experiment box was used to transport micrometeorite analogues up to 33000 m. The experiment box opened between 25000 to 30000 m exposing the samples to potential microbial contamination. It is hoped this work may help provide important insights on the significance of contamination of micrometeorites in this level of the stratosphere. The experiment box was successfully retrieved, and scanning electron microscope analysis has shown probable upper atmosphere contamination consisting of NaCl salt, calcium carbonate grains and sulphur-bearing dust grains, which are possibly volcanic aerosols. Culturing of samples in the laboratory is ongoing, to assess the degree of biological contamination and to see which type of sample, if any, the potential microbes preferred to latch on to.

 

So, they asked me if I could open a hatch at a certain altitude and then close it; this was to expose the samples to the atmosphere at that altitude, close it, and that’s it.

I started to work… My idea was to use one of my flight controllers (a MultiWii one), which has a precision barometer and can certainly drive servos, and just write a small snippet of code to open and close the hatch with a servo.

Scotland from above
Scotland from above

After reviewing the MultiWii code and the mission specifications, I decided to use the CAMTRIG function in the MultiWii code and just add an “if” that monitors the altitude: if it is above 25,000 meters, open the hatch and “lift a flag”; if the altitude is more than 30,000 meters, close the hatch. The flag (or lock) was just to prevent the hatch from opening when the balloon was plummeting back down towards the centre of the planet.

I had to make sure the barometer was going to work at that altitude, and also that the MultiWii was designed for such a big integer… The altitude is a 32-bit integer with a range of -2,147,483,648 to 2,147,483,647 cm, so no problem there.

I tested this using smaller altitudes, from 50 cm to 150 cm, and it worked flawlessly. I’m surprised by the precision of the barometer and by how well all the MultiWii code works!

So, I gave the team my FC, a BEC to power the controller and the servo, and a small battery. Everyone else did their awesome job and the balloon was launched; everything worked great, the balloon landed and the payload/cargo is safe. We have top men working on it right now. Who? (you might ask…) TOP… MEN.

 

And now I can proudly say:

 

I have sent some stuff to space!

I have sent a MultiWii to space!

Part of me has been in space!

… haha someone stop me.

The code looked like this:

[code language="cpp" title="MultiWii code to open a hatch"]
#if defined(CAMTRIG)
  #define OPEN_HATCH    2000     // servo pulse width in µs
  #define CLOSE_HATCH   1000
  // Altitudes (alt.EstAlt is in cm): open above 25,000 m, close above 30,000 m
  #define OPEN_AT_ALT   2500000
  #define CLOSE_AT_ALT  3000000

  // If the altitude is between the two values, open the hatch
  if (alt.EstAlt > OPEN_AT_ALT && alt.EstAlt < CLOSE_AT_ALT && !Descendlock) {
    servo[2] = OPEN_HATCH;
  }
  // Keep the servo closed in any other case
  else {
    servo[2] = CLOSE_HATCH;
    // Latch once past the top altitude so the hatch stays shut on the way down
    if (alt.EstAlt > CLOSE_AT_ALT) Descendlock = true;
  }
#endif
[/code]

UAS Grand Challenge

University of Glasgow MAST Lab entry to the iMechE Unmanned Aerial Systems Grand Challenge 2014/2015.

The Unmanned Aircraft Systems Challenge (UAS Challenge) bridges the gap between academia and industry in developing applied UAS-related activities.

We designed an H quadrotor capable of carrying 1 kilogram of flour, doing waypoint navigation and dropping the payload in a designated area.

This particular vehicle was built using 3D printed parts and carbon fibre rods; the video explains more about it.

We used a Pixhawk as the flight controller and a Raspberry Pi 2 as the companion computer. Several Python scripts were written to send commands via the serial port to the Pixhawk; Drone API was used on the RPi 2.
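The serial link makes that connection a one-liner from the RPi 2 side; a minimal sketch (the port and baud rate are assumptions for a typical Pixhawk telemetry hookup):

[code lang="python"]
from dronekit import connect

# Assumed wiring: Pixhawk telemetry port to the RPi serial port at 57600 baud
vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)
print("Mode: %s | Armed: %s" % (vehicle.mode.name, vehicle.armed))
vehicle.close()
[/code]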

The vehicle flew perfectly in manual and in autonomous mode, and several tests were performed. The RPi 2 / Pixhawk combo is a great way to do UAS applications like this one, and this is living proof of it.

We had 3 ways of communicating with the vehicle:

– Standard RC

– 3DR radios (900 MHz)

– 2.4 GHz WiFi high-gain antennas, using SSH to the RPi 2 and then mavproxy.py to the Pixhawk

It’s important to note that we also had an RTSP video server using the RPi camera, with 0.5 seconds of delay, in HD at 30 fps… It worked great.

Our payload mechanism was extremely simple yet very practical and useful, 3D printed as well and fully tested; the video will show that 😉

Tools like the SITL simulator were really helpful for testing our scripts before running them on the actual vehicle; special thanks to the developers who make these tools easily accessible to us.

You might be wondering how well we did… The vehicle worked great; we passed scrutiny very easily and it was deemed airworthy after just adding some padding for the LiPos. The problem came later: a manual test was required before the automatic mission, and it had to be flown by a certified pilot…

Moment of disaster

Currently in the MAST Lab we don’t have a certified pilot, so the iMechE provided one for our vehicle. As many of you know, every vehicle is different… and a prototype even more so. The problem is that he did not get a chance to practice before the actual flight… The vehicle took off and, at about 40 centimeters, it pitched super aggressively; the tip of the vehicle touched the ground and it flipped. That was it.

That particular part was not on our spares list, and our 3D printer was 500 miles away in Glasgow (hence the last song in the video, duhhhh). Oh, by the way, we have a MakerBot Replicator 2.