No.3 - Getting Started in EAA, Part II

In the last issue, I mentioned that I would take a look at the typical equipment choices to consider when starting out in EAA. The basic hardware components of an EAA set-up are: a camera to capture images, some sort of telescope to gather light, a mount that can track a celestial target over time while capturing an image, and a computer or other digital device to create and display an EAA image using EAA software. These are also the basic components of an astrophotography rig, although AP software differs from EAA software. Many EAA beginners start with a telescope on a mount that they might have used for visual astronomy, and ask, "What camera should I begin EAA with?" or "Does my mount track well enough to do EAA?". Those are important questions for a beginner trying to cobble together an EAA set-up, and I'll give my thoughts in due course.

Over the last couple of years, the advent of all-in-one robot or smart scopes has given the beginner who does not want to cobble together an EAA rig the alternative of purchasing an integrated package. These smart scopes integrate the mount, telescope, and camera into a single device that is ready to automatically capture images and display them for EAA, or save them for later processing if doing astrophotography. The most popular all-in-one smart scopes have been small refractors on alt-az mounts using small-sensor cameras, with proprietary software seamlessly controlling all components of the device, so that the user need only enter a few commands on a smartphone, tablet, or computer to control the smart scope. Smart scopes are evolving rapidly, and I'll focus on their use specifically for EAA in a later blog.

In addition to the above equipment, many people doing EAA will consider a reducer lens or corrector-reducer lens to lower the effective f/ratio of a scope and "speed up" image acquisition. Another important tool, mentioned in the last issue, is a filter to battle light pollution and enhance the visibility of certain targets.


Cameras

Most amateurs doing EAA have migrated from CCD to CMOS cameras over the last decade, often with sensors manufactured by Sony for other low-light applications, such as security cameras. (Sony discontinued production of CCDs in 2017.) As the amateur astronomy market accounts for a small fraction of the commercial market for Sony sensors, most CMOS sensors used by amateur astronomers were developed for other applications. Fortunately, there are many commercially available sensors - both monochrome and color - that have been incorporated into astronomy cameras.

The choice of whether to start EAA with a color camera or a monochrome camera depends on your preferences: if you want the most sensitive camera (in terms of picking up the faintest details), a monochrome camera may be a better choice than color. If, on the other hand, you want to see color in your EAA live stacked images - for example, in the summer emission nebulae - then a color camera may be the better initial choice. I started with a very small sensor, uncooled one-shot-color camera, the ZWO ASI224MC, many years ago, but opted to move to monochrome for the greater sensitivity. Most people do prefer color, though, because of the extra dimension of information that color can convey, plus the views of some targets are just prettier in color.

After deciding whether to start with a color or monochrome camera, the next question a beginning EAAer often has is, "How big a sensor do I need?". The answer depends on a couple of other considerations - in particular, how big (in terms of angular size) are the targets you want to view via EAA, and what focal length telescope are you going to use. The key idea is that you want the target to fit within the field of view (FOV) of the camera sensor when used with the telescope you have chosen: if the target is very large and the telescope has a long focal length, you'll need a large sensor to cover it. Conversely, a target like a small galaxy may easily fit into the FOV of a small sensor, even with a moderate focal length telescope. For example, let's suppose you have a camera with a small IMX290 monochrome sensor, as I do, and you want to use it with your venerable Celestron 8-inch f/10 Schmidt-Cassegrain to capture a large target like M42, the Great Nebula in Orion. Using the C8 at f/10, is that camera sensor big enough to cover M42, which spans around 1 degree in both RA and Declination? Let's do the calculation to see if M42 will fit into the FOV:

For a sensor that is W pixels in width and H pixels in height, with an image scale of I arcsecs/pixel, the field of view is: 

(W x I )/60 arcmins in width, and (H x I) / 60 arcmins in height 


where the image scale, I, is given by the formula:

I (arcsec/pixel) = 206.265 x pixel size (microns) / telescope focal length (mm)




Player One Apollo-M Mini
For the C8 at f/10, the telescope focal length is about 2000mm, while the IMX290 has a pixel size of 2.9 microns in an array of 1936 x 1096 pixels. Plugging these numbers into the above formulae gives an image scale of 0.3 arcsec/pixel, and a FOV of 9.7 arcminutes x 5.5 arcminutes ... much too small to fit M42 onto the sensor. Even using a larger sensor, like my Player One Apollo-M Mini at left, and reducing the C8 to 1250mm focal length with an f/6.3 reducer, M42 is still too large to fit into the FOV.
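If you'd rather not push these numbers through a calculator by hand, the two formulae above translate directly into a few lines of Python (the function names here are just for illustration):

```python
# Sketch of the image scale and FOV formulae from the text.
def image_scale(pixel_um, focal_mm):
    """Image scale in arcsec/pixel."""
    return 206.265 * pixel_um / focal_mm

def fov_arcmin(width_px, height_px, scale):
    """Field of view (width, height) in arcminutes."""
    return width_px * scale / 60, height_px * scale / 60

# IMX290 (2.9 micron pixels, 1936 x 1096) on a C8 at f/10 (2000 mm)
scale = image_scale(2.9, 2000)          # ~0.30 arcsec/pixel
w, h = fov_arcmin(1936, 1096, scale)    # ~9.7 x 5.5 arcmin
print(f"{scale:.2f} arcsec/px, FOV {w:.1f} x {h:.1f} arcmin")
```

Running this reproduces the numbers above, confirming the FOV is far smaller than M42's roughly 60 arcminute extent.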




A little bit of arithmetic shows that if we want to squeeze a target covering roughly 1 degree in both directions onto the tiny IMX290 sensor, we can do so only with a telescope that has a focal length of no more than 182mm, i.e. a very short focal length scope. Alternatively, we could combine the C8 with an f/6.3 reducer, yielding approximately 1250mm focal length, and a full-frame sensor like the IMX410, which covers 36mm x 24mm. The IMX410 has a pixel size of 5.94 microns in an array of 6072 x 4042 pixels. The above formulae give an image scale of 0.98 arcsec/pixel and a FOV of 99.2 arcmins x 66.0 arcmins, which would comfortably cover M42 at a well sampled image scale. Given the cost of cameras with full-frame sensors, like the IMX410, many doing EAA will opt for a smaller sensor with a shorter focal length scope to accommodate targets that require a large FOV.
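That "little bit of arithmetic" is just the FOV formula run in reverse: fix the target's angular size and the sensor's shorter side, solve for focal length. A quick sketch (the function name is my own, not from any library):

```python
# Longest focal length (mm) that still fits a target of a given angular
# size across a given number of pixels of a given size.
def max_focal_length(target_arcsec, pixels, pixel_um):
    # Image scale (arcsec/pixel) needed to span the target:
    scale_needed = target_arcsec / pixels
    # Invert I = 206.265 * pixel_size / focal_length for focal_length:
    return 206.265 * pixel_um / scale_needed

# A 1-degree (3600 arcsec) target on the IMX290's short side
# (1096 pixels of 2.9 microns):
fl = max_focal_length(3600, 1096, 2.9)
print(f"max focal length ~{fl:.0f} mm")   # ~182 mm
```

The same function applied to the IMX410's short side (4042 pixels of 5.94 microns) gives roughly 1300mm, which is why the C8 with an f/6.3 reducer at about 1250mm just squeezes the target in.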

You'll notice that we haven't discussed the relevance of the image scale when determining what camera to use. Image scale is important as it puts a constraint on how much potential detail a given set-up will be able to resolve. For example, if you typically have 2-3 arcsecs of seeing on an average night, then using Nyquist sampling theory you would typically want a star to cover 2 or 3 pixels to be resolved (and 3 pixels for a circular appearance). So for 3 arcsec seeing, an image scale of about 1 arcsec/pixel would show stars as circular - a smaller image scale (< 1 arcsec/pixel) would not show any more detail (oversampled), while a bigger image scale (> 1 arcsec/pixel) would start to lose detail (undersampled). In this example, an image scale around 1 arcsec/pixel is considered well sampled.

In many cases there is a trade-off between FOV and image scale: for example, if we look at the above case where an ASI290MM is used with a short 182mm focal length scope to fit M42 into the FOV, we find that the image is undersampled for typical 2-3 arcsec seeing: the image scale of that set-up - about 3.3 arcsec/pixel - is not sufficient to resolve the 2-3 arcsec details that seeing would allow us to resolve with a longer focal length scope and the same camera. If the intention is to capture both a large FOV and all the detail in an image that seeing allows, then the set-up may require both a longer focal length scope and a large sensor. If your primary interest is in capturing a wide field, such as a large emission nebula, and you don't mind undersampling (i.e. losing the finest detail), then a shorter focal length scope with a smaller sensor camera may suffice.
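Putting the sampling rule of thumb into code makes the trade-off concrete. This is only a rough classifier, assuming the 3-pixels-per-star guideline from the text (the thresholds and function name are my own choices):

```python
# Rough sampling check: for seeing of s arcsec and ~3 pixels per star
# disk, a well-sampled image scale is about s / 3 arcsec/pixel.
def sampling(scale, seeing, pixels_per_star=3):
    """Classify an image scale (arcsec/px) against the seeing (arcsec)."""
    well = seeing / pixels_per_star       # well-sampled image scale
    if scale > 1.1 * well:
        return "undersampled"
    if scale < 0.9 * well:
        return "oversampled"
    return "well sampled"

# The IMX290 on a 182 mm scope from the example above:
scale = 206.265 * 2.9 / 182               # ~3.3 arcsec/pixel
print(sampling(scale, seeing=3))          # -> undersampled
```

At 3.3 arcsec/pixel against 3 arcsec seeing, the wide-field set-up is clearly undersampled, which is exactly the compromise described above.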

One other question that is often asked is whether a cooled camera is needed for EAA. One advantage of a cooled camera is that thermal noise can be minimized, and dark frame calibration for some modern sensors can be skipped if the camera is operated at a low temperature. These advantages also apply to EAA - using a cooled camera reduces thermal noise by the same amount in a cumulative live stack of short exposures as it does in a stack of longer exposures with the same total exposure. However, thermal noise is just one source of noise in an image, and depending on the conditions (e.g. ambient temperature, background light pollution, and camera read noise) it may not be the dominant source. For someone doing EAA on a hot summer night, a cooled camera is definitely helpful in reducing thermal noise; in the middle of winter on a cold night, not so much. There is also an advantage to having a camera that can be operated at a fixed temperature from night to night, as cooled cameras can be: simpler calibration. Many EAAers start with uncooled planetary cameras, which typically have smaller sensors and are inexpensive relative to larger cooled cameras, and then trade up to larger cooled cameras over time.


Software

Anyone using an astronomical camera to capture images of the night sky will be familiar with camera control software, i.e. software that takes an exposure and saves it and/or displays it on a computer or other device. For EAA, we also need software that can produce a real-time view of the target, often through the live stack process of aligning many short sub-exposures to produce a cumulative live stack that shows more detail as time passes. For EAA, the process is usually interactive, with the EAA observer tweaking a live stack histogram to try to eke out more detail as the stack progresses. Some people enjoy that interactive process, while others - typically the casual AP imager - just want to "get" a final image to share with others. Either way, it is worth looking at what software is available to do this.

Some camera vendors have their own proprietary EAA software which will work only with their cameras. For example, Atik, Starlight Xpress, ZWO and MallinCam each have their own such software that can be used only with their cameras. In addition, there are a few independent EAA software products that can be used with multiple camera brands. The leading example of independent EAA software is SharpCap, which provides native support for many types of cameras as well as support for cameras that have an ASCOM driver. However, SharpCap runs only on Windows machines. I have used several different EAA software applications over the years, including ASILive, which can control only ZWO's ASI cameras, and Starlight Live for SX cameras. SharpCap has more bells and whistles than many of the proprietary applications and is currently what I use most of the time. If you have a specific camera, it's more than likely that the current version (4.1) of SharpCap will support it. For some people, one downside of SharpCap is its limitation to Windows. As with much astronomical software that runs only on Windows, there are workarounds that allow Windows to run on machines with other operating systems, such as Linux or macOS. Alternatively, you can get a cheap Windows machine dedicated to running SharpCap and other Windows-based astronomical software.

In addition to EAA software capable of live stacking, most people doing EAA will use other software tools in the course of an EAA session. For example, a planetarium program such as Stellarium, Cartes du Ciel, or several other such programs can be used to control a mount. If a planetarium program is used in conjunction with a plate solving program and EAA software, it is possible in one click to select a target and, through a "sync and center", point the scope so that the target is centered in the camera's FOV. In this automatic process the mount executes a GOTO to the target co-ordinates, the camera then takes a snapshot of the FOV, and the plate solver determines how far the actual field captured by the camera differs from the target's location and executes another GOTO to correct the pointing error so the target is centered on the sensor. There are several plate solving programs which can be used with planetarium programs and SharpCap; my favorite was ASTAP, as it is reliable and relatively fast in my experience. However, recent versions of SharpCap have incorporated SharpSolve, a plate solver written specifically for SharpCap by Robin Glover, the developer of SharpCap. SharpSolve is the fastest and one of the most reliable plate solvers that I have used, and I highly recommend it for anyone using SharpCap as their EAA software.

In the next issue, I will talk about the other two main hardware components - mounts and telescopes - to consider for an EAA set-up. 










