August 27, 2003: the day when Mars reached its closest point to Earth in 60,000 years.
There was a lot of hype (and hoaxes) around this Mars opposition. I was also preparing for it by making a 12.5” F6 Newtonian. Still a Ph.D. student at the time, I got help from several friends to make this happen.
The Mars season was a success! With a modified webcam (I sourced a monochrome CCD chip from Italy to replace the color sensor in the webcam!), I captured some really good Mars images.
The success fueled a dream: a larger scope for planetary imaging. I started to plan out a 16” F7.2 Newtonian on a tracking Dobsonian mount. As an engineering student, I felt nothing was impossible at the time.
The design used a mixture of steel, aluminum, and carbon fiber. While the mirror was on order, I built a 6” finder scope first as a testing piece. It was completed in 2004:
By summer 2007, the main scope was taking shape, again with the help of friends, and it was quite impressive 😊
The first light happened in March 2008, I believe (I have lost a few pictures of that…), and I remember the view of the Moon was incredible.
If you look carefully, you may see motors on the Dob mount in the picture above. Building the tracking system for the scope took a long time and was not successful. At the same time, I went on the market for a tenure-track position and started realizing how inadequate my CV was. The work on the scope slowed, and then stopped.
Many things happened between 2008 and 2020… The scope spent most of this period in the darkness of a garage, as evidence of “I once had a dream.” I still had the dream, just not the time and energy to pursue it. In the meantime, I found a 16” F5.85 mirror to replace the F7.2 mirror and make the scope more practical.
In 2020, I got a call from Mars again and resumed my astrophotography journey after a 16-year break. With improved cameras and image processing tools, I was able to take better Mars images with an 11” SCT.
But what I really wanted was to complete the 16” scope. So, I started working on it again. I bought an equatorial platform to allow tracking, but was disappointed to discover the poor quality of my new F5.85 mirror.
After an 18-month wait, I now have another mirror, a 16” F5.25 made by Zambuto. I also modified the scope to be mounted on an equatorial mount. It looks great, but is a bit too tall on a pier… A shorter pier (see the photo at the beginning) solved the problem, so I can take down the scope by myself at night.
My spaceship is finally ready to go, and I have been enjoying the ride since. Should I dare to dream bigger?
(Click on the photos to see larger sizes, and check out more photos in the Gallery.)
A couple of weeks ago, a team of WVU students and I traveled to Utah to compete in the University Rover Challenge (URC) for the first time. It had been five years since I was last at a robot competition. This time, we had a new group of passionate and talented students, which brought back a lot of memories and excitement. We ended up doing well for a first-time team, but not without struggles and some luck.
Going to a robot competition means getting out of one’s normal life routine. In a short few days, unexpected events unfold rapidly in front of everyone’s eyes, followed by rapid and intense problem solving by the team members. In this post, I will mention just a few of these surprises.
Imagine you are sending a rover to Mars for a science mission. Your rover needs to be in other people’s hands for transportation and payload integration. It has to survive the rocket launch, months of interplanetary travel, and the short but horrifying landing process. It may not end up in the exact location on Mars that you hoped for. Once it’s there, there is only so much you can do about the rover, and things start to break as the rover moves from one place to another…
URC was a bit like that. As any good robot challenge should, it had many elements of surprise. Some of these surprises were imposed by the physical world, like in a real Mars mission, and some were exclusively for first-timers like us.
Our launch vehicle was a brown UPS truck. We packed everything into five wooden crates and a cardboard box, almost 300 kg of gear. After traveling on the Earth’s surface for three days, the shipment arrived in Denver. A two-person team picked it up with a van and completed the remaining 7-hour journey.
Several parts broke during this trip, mostly 3D-printed ones. Luckily, we brought backups. Our 3D printer also arrived with a broken motor mount. A team member (Tyler) zip-tied the motor in place so the printer could print its own replacement part, practically creating a self-repairing 3D printer. To our surprise, all the steel bolts on the rover were heavily rusted, as if the UPS truck had taken a sea route.
Getting the robot ready for the first two missions (Equipment Servicing and Autonomy) on the first competition day took a long time. Some testing was pushed to after dark. At close to 11pm (1am Eastern time), things started to look really good, with everything working. While we were powering down the system, an (unpowered) GPS cable fell into the electronics box and came close to (but did not quite touch) the power distribution board. After a small flash under one of the darkest night skies in the US, everything went quiet.
The incident renewed the night’s excitement, and sleep was no longer important. Close inspection of the power board revealed that an inductor had melted down. The inline fuse was still intact, and there was no way to tell whether the electronics downstream (e.g., the computer) were still OK. Swapping out the power board with a backup took some careful deliberation and planning. Luckily, everything worked, and there were still over two hours left to sleep before we needed to get on the road.
It was a small miracle that the robot worked for the Equipment Servicing task without a chance for full system testing after putting everything back together. We probably couldn’t have done much better without a more in-depth understanding of the tasks, which could only be acquired by being there.
The Autonomy task was more… dramatic, for lack of a better word. The robot held its position (like in the picture below) for almost the entire duration of the 30-minute mission. At the very last moment, it took off and reached its first waypoint. For those of us outside the command station trailer, a motionless robot can trigger many emotions and speculations. The members inside the trailer, meanwhile, were in a frantic problem-solving mode. Clearly, time went by at very different rates just a few meters apart.
What turned out to have happened was that the terrain map loaded on the rover was centered around the habitats of MDRS. For the actual URC competition, the organizers split the four missions across three locations about 1 km apart. The starting point of the Autonomy mission was just outside of our prior map. Finding itself off the map, the robot did not know what to do. It took the team members just a few minutes to diagnose the problem, and then many trials and errors to place a blank map in the right position so the robot could move. It worked! I have seen many “autonomous” robots make up their minds not to go anywhere during competitions… this was the first time I saw a robot change its mind (with some human help, of course).
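The failure mode can be illustrated with a minimal sketch. This is hypothetical code, not our actual navigation stack: a fixed-origin grid map, and a pose check that fails when the robot starts outside the map, so the planner has no cells to search and the robot simply holds position.

```python
# Minimal sketch (hypothetical; not the rover's real code) of a grid map
# with a fixed world origin, and the bounds check that made the robot
# refuse to move when its start pose fell outside the loaded map.

from dataclasses import dataclass

@dataclass
class GridMap:
    origin_x: float    # world x of the map's lower-left corner (m)
    origin_y: float    # world y of the map's lower-left corner (m)
    resolution: float  # meters per cell
    width: int         # cells
    height: int        # cells

    def contains(self, x: float, y: float) -> bool:
        """True if world point (x, y) falls inside the map bounds."""
        col = (x - self.origin_x) / self.resolution
        row = (y - self.origin_y) / self.resolution
        return 0 <= col < self.width and 0 <= row < self.height

# Map anchored near the habitat: 1 km x 1 km at 0.5 m resolution.
habitat_map = GridMap(origin_x=-500.0, origin_y=-500.0,
                      resolution=0.5, width=2000, height=2000)

print(habitat_map.contains(0.0, 0.0))      # near habitat: True (inside)
print(habitat_map.contains(1050.0, 30.0))  # mission start ~1 km away: False

# The field fix amounted to re-anchoring a blank map around the new site,
# so the same start pose now falls inside the bounds.
blank_map = GridMap(origin_x=550.0, origin_y=-470.0,
                    resolution=0.5, width=2000, height=2000)
print(blank_map.contains(1050.0, 30.0))    # True: the planner can search
```

The numbers (map size, resolution, coordinates) are made up for illustration; the point is that a map with a hard-coded origin silently excludes any start location chosen by the organizers outside it.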
With a bit more time and experience, we were better prepared for the next two missions on the following days: Science, and Extreme Retrieval and Delivery. There was no shortage of surprises and issues, but the team (and the rover) held up well.
An adventure like the URC trip teaches us the meaning of real-world engineering. To know a system works, we need to put it through the test of truly new environments and unexpected situations, out of the control of the robot designers. The thought that we shipped a rover to one of the most uninhabitable deserts in the continental US thousands of kilometers away and still managed to make it work in all four missions is quite satisfying. Many other teams, especially international ones, had to cope with even harder constraints, like designing the rover to fit in airline carry-on cases.
When a new problem arises during a competition, and it almost certainly will, it needs to be understood and solved quickly, either by the robot itself or by team members. Luckily, robot designers and programmers are trained problem solvers, although their performance can be further improved with more systematic approaches. For autonomous robots, problem solving is a much harder challenge and perhaps the greatest gap in current robotics research.
Here is a group photo of the team along with our judge (second from the left), taken in front of a MDRS habitat after the Extreme Retrieval and Delivery mission.
We have been working on the topic of robotic precision pollination for a few years now and will continue down this path for the foreseeable future. By “precision,” we mean treating crops as individual plants, recognizing their individual differences and needs, much like how we interact with people. If our robots can touch and precisely maneuver small, delicate flowers, they could be used to take care of plants in many different ways.
But why would anyone want to take over bees’ pollination job with robots? We don’t, and I would much rather see bees flying in and out of flowers. What we would like to have is a plan B in case there are not enough bees or other insects to support our food production. With a growing human population, the ongoing loss of bee colonies, and climate change, this could become a real threat. We also want to be able to pollinate flowers in places that bees either do not like or cannot survive in, such as confined indoor spaces (e.g., greenhouses, growth chambers, and vertical farming settings) or even on a different planet.
In our previous project, we designed BrambleBee to pollinate bramble (i.e., blackberry and raspberry) flowers. BrambleBee looks like a jumbo-sized bumblebee with a big arm, but it cannot fly. We did not want to mimic bees’ flying ability; instead, we learned from bees’ micro-hair structures and motions (thanks to our entomology team led by Dr. Yong-Lak Park) and used a custom-designed robotic hand to brush the flowers for precision pollen transfer.
BrambleBee served as a proof of concept, and it was fun to watch it work, but many challenges remain. For example, each flower is unique, and there are many complex situations for a robot pollinator to handle (e.g., tightly clustered flowers, occlusion, deformable objects, plant motion, etc.). How to convert an experimental robot system into an effective agricultural machine that growers will accept is another major challenge. These are the research topics we will tackle with our next robot, StickBug.
Wait… I should say “robots,” because StickBug is not a single robot. It will be a multi-robot system with four agents (one mobile base and three two-armed robots moving on a vertical lift).
We have a talented, motivated, and diverse team that includes horticulturists (Dr. Nicole Waterland and her students), human-systems experts (Dr. Boyi Hu and his students from the University of Florida), and roboticists (Dr. Jason Gross and I, along with undergraduate and graduate students from #WVURobotics). This project will be open source, starting with sharing our proposal. If you have any suggestions on our approach or are interested in collaborating on the project, please feel free to contact us.