Animal Vegetable Robot
PROJECTS
I have been experimenting with placing networked video cameras outdoors since about 1998, mainly for viewing wildlife via the Internet at my field station. Our success in Internet bird watching led to the installation of the Moss-Cam in 2002, the first camera to collect daily time-lapse color and infrared images of a small patch of moss, a species that is particularly sensitive to moisture. This experiment successfully recorded images nearly every day until 2008. Our setup included a wireless weather station coupled with weather-tight cameras and an image database system to store the noon-time images.
During this time, we explored other time-lapse applications for measuring and documenting plant growth and responses to environmental conditions. Unfortunately, networked cameras at that time were expensive and limited in image resolution, so our lab team worked out a methodology using an off-the-shelf Canon pan-tilt-zoom (PTZ) networked surveillance camera that a server could control. The program-controlled camera collected hundreds of overlapping still images across multiple horizontal panoramas, which were then sequentially stitched in Photoshop into a single gigapixel-resolution panoramic scene.
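For readers curious about the mechanics, here is a rough Python sketch of that kind of server-driven grid scan. The camera URLs and angles below are placeholders for illustration only, not the actual Canon control commands, and the overlap would need tuning for a real lens.

```python
import time
import urllib.request

# Placeholder camera address -- the real Canon PTZ camera used its own
# HTTP control protocol; these URLs are illustrative only.
CAMERA = "http://camera.example.net"

def move_to(pan_deg, tilt_deg):
    """Point the camera at the given pan/tilt angles (placeholder request)."""
    urllib.request.urlopen(f"{CAMERA}/control?pan={pan_deg}&tilt={tilt_deg}")
    time.sleep(2)  # let the mechanism settle before shooting

def snap(filename):
    """Grab a full-resolution still and save it to disk (placeholder request)."""
    with urllib.request.urlopen(f"{CAMERA}/snapshot.jpg") as resp:
        with open(filename, "wb") as f:
            f.write(resp.read())

# Sweep several horizontal rows, overlapping neighboring frames so a
# stitcher such as Photoshop has shared detail to merge on.
for row, tilt in enumerate(range(-10, 31, 20)):      # three tilt rows
    for col, pan in enumerate(range(0, 360, 15)):    # 24 overlapping columns
        move_to(pan, tilt)
        snap(f"pano_r{row:02d}_c{col:02d}.jpg")
```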
Today, a Dahua 360 camera mounted on my roof duplicates the functionality of our early work, both in hardware and in built-in software. Thirty times a second, a perfectly stitched 360-degree hemispherical image, 12 megapixels in resolution, is generated and recorded to a video server that can apply machine-learning analysis to detect patterns. I am not currently auto-detecting anything in particular, but even my Olympus SLR camera now has machine-learning detection for bird photography. My curiosity is leading me toward ways to automatically record birds in flight, interesting clouds, and cool sunsets.
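As a point of reference, grabbing a single stitched frame from a camera like this takes only a few lines of Python with OpenCV, assuming the camera exposes a standard RTSP stream; the URL below is a placeholder, not my actual camera address.

```python
import cv2

# Placeholder RTSP address -- substitute the camera's real hostname,
# credentials, and stream path.
STREAM_URL = "rtsp://user:password@camera-address/stream1"

cap = cv2.VideoCapture(STREAM_URL)
ok, frame = cap.read()                   # one stitched 360-degree frame
if ok:
    cv2.imwrite("sky_frame.jpg", frame)  # save it for later analysis
cap.release()
```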
The following 24-hour movie was built by recording a single frame every 15 minutes and saving the sequence as an m4v movie. The movie plays back at 30 frames per second, effectively compressing about one hour of the entire sky into each second. It is digitally warped into an interactive panorama using JavaScript written by Matthew Petroff at Pannellum.org. Once you click the window, the video will play for 28 seconds and then automatically loop until you select pause. Click and drag with your mouse to move to a new perspective at any time, whether the video is playing or paused.
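For anyone who wants to build a similar movie, here is a minimal Python sketch that assembles a day's worth of stills into a 30-frames-per-second MPEG-4 file with OpenCV; the folder and file names are hypothetical, and this is not necessarily how my own movies are produced.

```python
import glob
import cv2

# Hypothetical folder of stills captured every 15 minutes over one day;
# adjust the path and naming pattern to match the actual archive.
frames = sorted(glob.glob("frames/2021-08-09/*.jpg"))

first = cv2.imread(frames[0])
height, width = first.shape[:2]

# Write the sequence out at 30 frames per second as an MPEG-4 movie.
writer = cv2.VideoWriter("sky_2021-08-09.m4v",
                         cv2.VideoWriter_fourcc(*"mp4v"),
                         30, (width, height))
for path in frames:
    writer.write(cv2.imread(path))
writer.release()
```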
Below is a calendar of links to all the 24-hour time-lapse movies recorded in this project to date. Clicking a linked file opens a video player that allows full-screen viewing. I began these recordings on August 9, 2021. The movie links are added manually, so bear with me if there are any delays. The image geometry, or projection, is a full (180° by 360°) rectangular hemisphere: you are viewing the entire sky in a single image.