(the Burke-Gaffney and Abbey Ridge Observatories)

Author: Dave Lane (the Director at BGO and owner of ARO)
Published in Nova Notes - June 2016

In the past year or two I developed and brought online two robotic observatories that any of you can use from the comfort of your easy chair. The two are almost identical: the Burke-Gaffney Observatory (BGO) has a PlaneWave 24-inch telescope (nicknamed “Ralph”) under light-polluted Halifax skies, and the Abbey Ridge Observatory (ARO) has a Celestron 14-inch telescope (nicknamed “Sam”) under fairly dark Stillwater Lake skies 23 km west of Halifax.

The BGO recently completed a major renovation and telescope and equipment upgrade, funded principally by Dr. Ralph Medjuck. Part of that project involved automating the observatory, with the goals of making it much more productive (given the challenging weather in Nova Scotia) and very easy for our students and others to use. I chose the Twitter social media platform as the human-machine interface, since many students and others already use it. Twitter has the advantage of being accessible from any web browser, and apps are available for all tablets and smartphones.

Once the software was stable in late 2015, it was ported to ARO, which came online in early January. There are some minor differences between the two observatories (e.g. the filter options differ and the field of view of Sam is a bit smaller), but they run the same programs and so behave almost identically. From here on in this article, when I refer to “it” or the “telescope”, it could be either Ralph or Sam.

Overall, the project has been declared a success, and it was recognized by Twitter as one of the 10 most innovative uses of Twitter in Canada. The Twitter interface was used extensively this past academic year by three astronomy classes. Observational projects undertaken ranged from simple images of deep-sky objects to long-term variability studies of an active galaxy and light curves of variable stars and extrasolar planets. It has also been used by high-school astronomy students and by many beginner and advanced amateur astronomers and astrophotographers in Canada and many other countries.

How It Works

You interact with the telescope using a Twitter account: you send it messages and it replies – Twitter calls these messages “tweets”. As long as it understands what you “tweet” to it, it obeys your commands and provides you with personalized images of just about any type of astronomical object visible from Nova Scotia! In addition to being interactive, it also passively tweets what it is up to as it works. See the Twitter feeds at @smubgobs and @abbeyridgeobs.

It listens for your commands 24/7, but of course it can only do its imaging work at night when most of us humans sleep, so it queues your imaging “requests” and runs them from the queue later. There are times when it has nothing to do and you can get control of it right away (a good reason to stay up late at night).

Each night, after it becomes somewhat dark, it uses its cloud sensor to monitor the sky. If the skies are clear, it powers up the scope and other equipment, opens the dome, focuses the camera, and syncs the scope’s position. From the time it is fully dark until shortly after the beginning of morning twilight, it works through its queued observation requests, pausing if it becomes cloudy and restarting when it clears. Dodging clouds is a necessary “game” that it must play!

If it isn’t clear when it becomes dark, it waits patiently for the sky to clear. The cloud sensor allows it to take advantage of nearly any patch of clear sky that passes over during the night. In the first six months of 2016, Ralph was able to get some observations done on 70 nights!

After it has closed its dome at dawn, it then takes whatever calibration images are needed, processes the night’s images and posts them to the “completed” queue on its website. It then notifies the observers that they have images ready. The cycle then repeats for the next night.
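For the programmers among us, here is a very rough Python sketch of that nightly cycle. Every name in it is an invented stand-in used only to illustrate the flow of a night; it is not the observatory’s actual software:

import random

# A rough, hypothetical sketch of the nightly cycle described above.
# Every name here is an invented stand-in, not the observatory's real code.

def sky_is_clear():
    """Stand-in for the cloud sensor: the sky is randomly clear or cloudy."""
    return random.random() > 0.3

def run_night(queue):
    print("Dusk: power up, open the dome, focus the camera, sync the scope's position")
    for hour in range(8):                  # pretend the night lasts eight 'hours'
        if not sky_is_clear():
            print(f"Hour {hour}: cloudy - pausing until it clears")
            continue
        if queue:
            target = queue.pop(0)          # the real scheduler scores the queue (see later)
            print(f"Hour {hour}: clear - observing {target}")
        else:
            print(f"Hour {hour}: clear, but nothing left in the queue")
    print("Dawn: close the dome, take calibration images, process and post the night's images")

run_night(["M33", "M13", "NGC 7000"])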

This seemingly simple description is, in practice, rather complicated to realize reliably, and about 100,000 lines of program code make it all work!

How To Use It

The first step is to become authorized – see the first resource link below for how to do that. There is also detailed help information available there. Once you are authorized, you are considered an “observer” and can request images. You should start by sending a test message like:

@smubgobs #hello

If it receives and understands your tweet, it will reply with:

#bgoreplies @davidjameslane Hi, I'm the Burke-Gaffney Observatory with its Ralph Medjuck telescope at Saint Mary's University!

In its simplest form, sending the following tweet will cause it to take a 3-minute exposure of the galaxy Messier 65:

@smubgobs #request object=M65

To which it replies:

#bgoreplies @davidjameslane Sorry, I cannot observe M65 in the next 30 days!

Oops, my mistake. M65 is a spring object in Leo and cannot be observed now. The telescope validates every request to make sure the object is in its database and reasonably observable from Halifax within the next month.

If you want to check an object’s validity before actually requesting it, try this:

@smubgobs #lookupobject object=M33

It replies with:

#bgoreplies @davidjameslane Fixed object M33 found at position RA=01:33:53.4 DEC=+30d39'04", and it can be observed now

Success! Now let’s actually request it, but with a longer exposure than the default, and let’s improve the image quality by observing it when it is high in the sky and the Moon is not too bright:

@smubgobs #request object=M33 exposure=300 maxmoon=75 minalt=60

It replies with:

#bgoreplies @davidjameslane Object M33 is in my request queue as ID 1574 (exposure=300 seconds filter=LUM)
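If you are curious how the telescope might pick apart the key=value syntax in these tweets, here is a small illustrative Python sketch that parses a #request tweet into its parameters. It is invented for this article, not the observatory’s actual code:

# Hypothetical illustration of the key=value request syntax shown above.
# This is not the observatory's actual code, just a sketch of the command format.

def parse_request(tweet: str) -> dict:
    """Extract key=value parameters from a #request tweet."""
    params = {}
    for word in tweet.split():
        if "=" in word:
            key, value = word.split("=", 1)
            params[key.lower()] = value
    return params

tweet = "@smubgobs #request object=M33 exposure=300 maxmoon=75 minalt=60"
print(parse_request(tweet))
# {'object': 'M33', 'exposure': '300', 'maxmoon': '75', 'minalt': '60'}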

When the observation actually takes place depends on more than just the availability of clear weather. Every time it’s ready to make an observation, it scans through the queue to see which observations can take place at that moment. Then it plays a “points” game involving a number of factors, including the priority (our students and special projects get higher priority), how far the telescope has to move from its present position, how long the object has been in the queue, and several other factors. The request with the highest point score is done, and then it tweets:

#bgoreplies @davidjameslane I have taken your observation of M33 (ID 1574)! I'll tell you when it's ready in the morning

And in the morning, after all images are processed, you will be notified. All this happens while most of us are sleeping! You are provided with both the raw data (a FITS-format file) and an automatically processed JPEG image, which usually shows the requested object nicely. See the sample 3-minute exposure of M33 below, taken on the night of June 30 for Twitter user @caerward.
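To make the “points” game a little more concrete, here is a hedged Python sketch of how queued requests might be scored and a winner picked. The factors and weights are invented for illustration only; they are not the observatory’s actual scheduling code:

# Hypothetical sketch of the 'points' scheduling game described above.
# The factors and weights are invented for illustration only.

def score(request, current_pointing_deg):
    """Assign points to a queued request (illustrative factors and weights)."""
    points = 0.0
    points += 100.0 * request["priority"]          # students/special projects rank higher
    points += 5.0 * request["nights_in_queue"]     # waiting longer earns points
    slew = abs(request["position_deg"] - current_pointing_deg)
    points -= 0.5 * slew                           # shorter telescope moves are preferred
    return points

def pick_best(queue, current_pointing_deg):
    """Return the observable request with the highest score."""
    observable = [r for r in queue if r["observable_now"]]
    if not observable:
        return None
    return max(observable, key=lambda r: score(r, current_pointing_deg))

queue = [
    {"object": "M33", "priority": 1, "nights_in_queue": 3, "position_deg": 40.0,  "observable_now": True},
    {"object": "M13", "priority": 2, "nights_in_queue": 1, "position_deg": 250.0, "observable_now": True},
    {"object": "M65", "priority": 1, "nights_in_queue": 7, "position_deg": 170.0, "observable_now": False},
]
best = pick_best(queue, current_pointing_deg=60.0)
print(best["object"])    # the higher-priority request wins in this example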

Advanced Usage

A “default” observer can have up to 3 requests in the queue at a time and exposures up to 5 minutes are allowed. Observers can be granted special privileges that allow, for example, up to 20 requests in the queue at a time, longer exposures, multiple filters in the same request, the ability to add new objects to the database, and the ability to have observations automatically re-run on multiple nights.

Interesting Observations

I have found it both interesting and gratifying to see how the observers have used the service and interacted with the Twitter interface. To date Ralph has completed over 1,400 requests covering a wide variety of celestial objects including planets, the Moon, comets, asteroids, stars, nebulae, clusters, galaxies, and quasars.

A few things that I have found surprising:

  • How well the images come out for a wide variety of object types, both bright and faint, using the automatic processing with only a few minutes of exposure time from our light-polluted skies.
  • How easy and flexible the interface is to use. This has meant very few “support” requests from the observers.
  • The imagination of the observers in choosing interesting targets.
  • How enthusiastic several observers have been about taking images through several filters and combining them into colour images.
  • How many observers have experimented with shooting the same object through different filters, including narrowband H-alpha and Oxygen-III.

While there have been many requests for common and bright objects such as M13 or the Orion Nebula, many other interesting and beautiful objects have been imaged – too many to mention here, but I will highlight a recent image taken by Sam for Centre member Andrew Frank. He requested an image of Makemake, one of the few dwarf planets out in the Kuiper Belt. I had no idea how easy it would be, but there it was on the image!

What Is Next

As the software running the system has been quite reliable, my focus in the coming months will be directed towards:

  • “Live processing”, so JPEG images are made available immediately
  • Posting images to Twitter (rather than just directing observers to the website)
  • Allowing for time-series observations (usually of variable stars)
  • Facebook support
  • Automatic colour image processing
  • Position offsets (e.g. for creating mosaics)

In addition, I would like to forge relationships with teachers so it can be a resource for students, particularly those teaching astronomy courses (and there are quite a few of these locally).

Invitation

To close, you are invited to use this service. Enjoy, and be sure to share your results with others!

Resources

To learn about both telescopes and to get authorized, visit: for Ralph: http://www.ap.smu.ca/pr/bgo-useme or for Sam: http://www.abbeyridgeobservatory.ca/robot

Facebook Group: Robotic Imaging at BGO & ARO: https://www.facebook.com/groups/1695977117312497/

Observation Queues – the requested and completed observation queues: for Ralph: http://www.ap.smu.ca/pr/bgo-useme/queues or for Sam: http://www.abbeyridgeobservatory.ca/robot/queues

 
