mk3 ScopeDog tested OK

Winterfest weather wasn't great, but there were enough clear patches for me to test my new mk3 code. In summary the changes are:

  • All Java code rewritten into Python
  • eFinder code integrated into main ScopeDog code, hence running on same Raspberry Pi
  • ScopeDog can call for a plate-solve automatically at the end of a GoTo and then refine the pointing, typically achieving about an arc minute of absolute accuracy.
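
The refinement step is essentially a closed loop: slew, plate-solve, measure the pointing error, nudge again if needed. Here's a minimal sketch of that logic — the `slew` and `solve` callables and the one-arc-minute tolerance are illustrative stand-ins, not the actual ScopeDog routines:

```python
import math

def pointing_error_arcmin(target_ra, target_dec, solved_ra, solved_dec):
    """Angular separation between target and plate-solved position, in arc
    minutes (coordinates in degrees, small-angle approximation)."""
    d_ra = (solved_ra - target_ra) * math.cos(math.radians(target_dec))
    d_dec = solved_dec - target_dec
    return math.hypot(d_ra, d_dec) * 60.0

def refine_goto(target_ra, target_dec, slew, solve, tol_arcmin=1.0, max_tries=3):
    """Slew, solve, and re-slew until the error is within tolerance."""
    slew(target_ra, target_dec)
    err = None
    for _ in range(max_tries):
        solved_ra, solved_dec = solve()      # capture an image and plate-solve it
        err = pointing_error_arcmin(target_ra, target_dec, solved_ra, solved_dec)
        if err <= tol_arcmin:
            break                            # close enough - resume tracking
        slew(target_ra, target_dec)          # nudge again using the new fix
    return err
```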

Considering how much had changed I was surprised and relieved to find it working so well.

But what was needed was a single hand pad to control all functions. The little 5 way navigation switch needed replacing too, as it was not good with gloves on!

Using the eFinder hand pad as the basis, it was easy to add the ScopeDog joystick to it. The Raspberry Pi Pico board in the hand pad is very flexible. I'm impressed with the Pico. It also gives me the ability to display some ScopeDog functions on the OLED text display. Here's a shot of the new combined hand pad. The new 5 way switch was expensive, but I now see it's worth it.

mk3 ScopeDog getting closer!

Now that the eFinder is working reliably, the next challenge was to integrate it further with ScopeDog. A little while back I revised ScopeDog to use the latest components, including a Raspberry Pi 4B. The Pi would have more than enough capacity to run the drives and the plate-solving.

First issue was that ScopeDog was written in Java, and the eFinder in Python. This would make complete integration difficult given my coding expertise. So I converted the Java code to Python. 

That done, I mashed together the new ScopeDog and eFinder codes. A few minor issues, but it worked almost straight away. The Pi OS & Python interpreter seem pretty good at managing threads, and doing repeated plate-solves has no effect on the drives. The camera just plugs directly into one of the spare USB ports on the ScopeDog Pi.
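
The reason the solves don't disturb the drives is that they run in their own thread and hand results back through a queue. A minimal sketch of that pattern — the lambda stands in for the real eFinder capture-and-solve routine, and the coordinates are made-up test values:

```python
import queue
import threading

def solver_worker(jobs, results, solve):
    """Run plate-solves in the background so the drive loop never blocks.
    `solve` stands in for the real eFinder capture-and-solve routine; a
    solve may take several seconds (or fail) without upsetting the drives."""
    while True:
        if jobs.get() is None:      # None is the shutdown sentinel
            break
        results.put(solve())

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(
    target=solver_worker,
    args=(jobs, results, lambda: (83.8, -5.4)),  # made-up RA/Dec for the sketch
    daemon=True,
)
worker.start()
```

The 1 Hz drive loop then polls `results.get_nowait()` each tick and simply carries on when the queue is empty.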

Next job was to combine the two handpads. First attempt was to use the eFinder handpad as is, which meant using buttons to steer the scope. It worked, but I missed the light-touch joystick I had been used to. So I made a new handpad, with buttons for eFinder and display control, and a joystick for scope control. It included the eFinder OLED text display, which gives useful options to display ScopeDog data. Using a Raspberry Pi Pico in the handpad gives a lot of flexibility in customising it for new features.

I’ll be taking the mk3 to Winterfest next week to give it its first shakedown for real.

Kelling Heath Autumn 2022

Back from a rather variable Kelling Heath astro camp. Observed for a while on three nights, so at least something!

Did get to try out some telescope changes…..

  • The altitude roller drive worked really well overall. However the scope does need to be better balanced. With my old belt drive I didn't have to worry much, but with a dew-sodden shroud and a 21mm Ethos, the clutch slipped. Easy enough to add some weights to the back.
  • The new high precision planetary gearboxes on the stepper drives are noticeably better. Much less backlash.
  • The eFinder worked perfectly without a single failed solve. It got quite a lot of interest too.
  • I swapped my x4 Powermate for a x2. Much better suited to me and my eyepiece set.
  • Realised my telescope cover really is worn out. It's been good for about 6 years, but I should have been more careful about sharp edges on the scope! Now it leaks in the rain.

More eFinder developments

I think the development of the eFinder is nearing its end! The transition to Skyfield to do the maths went well. I then decided to use a standard Raspberry Pi 32-bit OS with a fresh build of astrometry.net installed, rather than use Astroberry. This makes available some useful new features.

Producing a working local copy of astrometry.net isn't trivial, I discovered, but thanks to Dustin via the user group I got there and can now reproduce the build quite easily.

This process led me to look more closely at the capabilities of astrometry.net, and in particular its family of supporting programs. One of the most useful is the ability to directly obtain the RA & Dec of any pixel in the image, not just the image centre. I re-wrote the offset calibration almost completely and the result is much more accurate and stable.
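
As a rough cross-check on what the pixel query gives you: a pixel offset times the plate scale is the sky offset, with the usual cos(Dec) stretch in RA. This toy version shows the arithmetic — the 15 arcsec/pixel plate scale is just an example figure, and it ignores the field rotation and distortion that the proper WCS query handles:

```python
import math

def pixel_to_sky_offset(dx_pix, dy_pix, dec_deg, plate_scale_arcsec=15.0):
    """Convert a pixel offset from the image centre to an approximate sky
    offset (d_ra, d_dec) in arc minutes.  Assumes no field rotation, so it
    is only a sanity check on the WCS-based answer, not a replacement."""
    d_ra = dx_pix * plate_scale_arcsec / 60.0 / math.cos(math.radians(dec_deg))
    d_dec = dy_pix * plate_scale_arcsec / 60.0
    return d_ra, d_dec
```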

A very nice feature of the full build is the ability to annotate captured images with markers and labels for catalogue objects. The GUI control is now very comprehensive, and here is an example screenshot.

I've also switched to using a remote OLED display, rather than the LCD module mounted in the Raspberry Pi housing. The LCD got very sluggish at cold temperatures and wasn't very clear either. The OLED is very high contrast and perfect for dark adaptation. A 3m USB cable connects the display/control box to the Raspberry Pi. Full build instructions can be found here.


eFinder code gets revisited

With more people using the eFinder in the field, some bugs have emerged. The list presents some lessons to the amateur code writer (me!).

  • Stupid maths errors (or poor checking)
  • Not enough attention to detail in interface protocols
  • Relying on imported methods without checking their performance
  • Implementing too many changes at the same time

Biggest issue has been with my use of PyEphem to do the many coordinate transformations between RADec & AltAz. Originally I was doing these the hard way with my own code. Then I discovered PyEphem and it got a whole lot easier and neater. Only trouble is the results aren't consistent. I spent weeks trying to get them right, as there are many variables that can be set, but gave up. PyEphem was deprecated in 2003 and I guess it's no longer using the right dependencies. Fortunately the author had meanwhile produced Skyfield. This at first seemed very complicated, but as usual with these things, with a bit of patience I got my head around its principles and ended up getting sub-arcsecond consistency.
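
For reference, the core transform underneath all these libraries is the classic spherical-trig relation between hour angle/declination and altitude/azimuth — essentially what my original hand-rolled code did. A bare-bones version (no refraction, precession or nutation, which is exactly where a library like Skyfield earns its keep):

```python
import math

def hadec_to_altaz(ha_deg, dec_deg, lat_deg):
    """Hour angle & declination to altitude & azimuth, all in degrees.
    The bare spherical-trig transform; real-world corrections (refraction,
    precession, nutation) are deliberately omitted."""
    ha, dec, lat = (math.radians(x) for x in (ha_deg, dec_deg, lat_deg))
    sin_alt = math.sin(dec) * math.sin(lat) + math.cos(dec) * math.cos(lat) * math.cos(ha)
    alt = math.asin(sin_alt)
    cos_az = (math.sin(dec) - sin_alt * math.sin(lat)) / (math.cos(alt) * math.cos(lat))
    az = math.acos(max(-1.0, min(1.0, cos_az)))   # clamp against rounding
    if math.sin(ha) > 0:                          # object west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)
```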

So I’m now in the middle of re-writing all the eFinder variants to use Skyfield, and incorporate the many bug fixes.

ScopeDog mini-batch

I needed to build up a mk2 to test fully, and decided I needed one on my own scope. Ended up building a small batch!


Altitude roller drive progresses

I was happy enough with the prototype to build a 'final' version: a new base board (I want to keep the old one in case I have to revert to the previous altitude drive system), a linear actuator to operate the clutch, and a general smartening up all round.

The actuator takes about 0.5 seconds to operate and consumes no power once the state has changed. I changed the drive shaft flexible coupling to a more robust kind as I found the helical cut type was ‘winding up’ under load.

Biggest change has been to use the newly available 'high performance' planetary gearboxes that have about 1/3 the backlash. They are bigger but still very compact for the power they deliver.

Still no chance to test it under the stars as mid-summer bright nights are upon me. It performs well in the workshop so I am expecting it to be much better than my old design. Plus it is more compact and easier to assemble.


Here it is with base painted and with an acrylic cover for the drive/clutch assembly.


Trying out a new Altitude drive

My 18” UC Dobsonian uses a toothed belt to drive both axes. The azimuth is neat and hidden below the base. The altitude drive is effective and efficient, but does look ‘unusual’! One of its great features is the adjustable slip clutch - which is a priority for me.

I like the idea of the roller drives used by SpicaEyes and a few amateur builders. It drives both trunnions together and could make the scope even more compact and easier to assemble. But could it incorporate a clutch, and how well balanced does the scope need to be? Of course the answer is to build a prototype and experiment.


Here is the prototype. No base yet, just the telescope mirror and trunnion assembly sitting on the rollers, mounted on one of my workshop benches. Seems to work OK, but I need some decent rubber rollers. Heatshrink over plain bushes just doesn't work well!


Not installed yet is the actuator for the clutch. I'll try it first as a straight friction-slip clutch, but as a fallback I have a mechanism tested out to motorise it.

Real Time Computing challenges.

Re-writing ScopeDog into Python wasn't too difficult once I discovered the trick to getting a loop to maintain precise timing. If you're interested, insert these lines at the end of a while loop. They will ensure a precise 1 second period. (The rest of the loop can take a variable amount of time, as long as it is always less than 1 second.)

remain = 1 - (time.time() % 1)
time.sleep(remain)

But once I started to connect and combine my eFinder and SkySafari, things got difficult. Three components all have to cooperate: ScopeDog (the drive), eFinder (capture & plate-solving) and SkySafari (control, with an insatiable desire for position fixes!). Python wouldn't normally be used for real-time computing applications, but it is very suited to astronomy calculations and nice to work with.

ScopeDog must maintain an exact 1Hz tempo to keep its tracking accurate, but this needs to pause cleanly when a goto, move or joystick action is executed.
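
One way to get that clean pause is to gate the tick with a `threading.Event` — a sketch of the idea rather than the actual ScopeDog loop:

```python
import threading
import time

def tracking_loop(run_event, stop_event, tick, period=1.0):
    """Call tick() once per period while run_event is set.  Clearing
    run_event pauses tracking cleanly (goto, move or joystick action);
    setting it again resumes on the next period boundary."""
    while not stop_event.is_set():
        if run_event.is_set():
            tick()
        # sleep to the next whole-period boundary so the tempo never drifts
        time.sleep(period - (time.time() % period))
```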

The eFinder needs time to get an image and solve it. This can take 2-5 seconds and may even fail.

SkySafari just soldiers on, demanding position fixes at least every second, and throws an error if it doesn't get timely or well-formatted answers.

Most issues were overcome by creating a separate Python thread for each of these three functions. This worked, but still often one thread was waiting for results from another. This was then resolved by creating a 'virtual encoder' scope position, shared by all threads. This position is maintained by the 1Hz ScopeDog thread, using plate-solve results whenever available and computing a theoretical position in between times based on tracking and other scope moves.
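
Boiled down, the virtual encoder is just a small thread-safe object: the solver thread drops in an absolute fix whenever it has one, and every other thread reads a dead-reckoned position. A stripped-down sketch (the constant drive rates and degree units are purely illustrative):

```python
import threading

class VirtualEncoder:
    """Shared scope position: absolute plate-solve fixes when available,
    dead reckoning from the last fix in between."""

    def __init__(self, ra_rate_deg_s=0.0, dec_rate_deg_s=0.0):
        self._lock = threading.Lock()
        self._ra = self._dec = self._t = 0.0
        self._rates = (ra_rate_deg_s, dec_rate_deg_s)

    def set_fix(self, ra_deg, dec_deg, t):
        """Called from the solver thread whenever a plate-solve succeeds."""
        with self._lock:
            self._ra, self._dec, self._t = ra_deg, dec_deg, t

    def position(self, t):
        """Called from the drive and SkySafari threads: extrapolate from
        the last fix using the current drive rates."""
        with self._lock:
            dt = t - self._t
            return (self._ra + self._rates[0] * dt,
                    self._dec + self._rates[1] * dt)
```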

It's working, just. At least well enough to try on the scope at night.

ScopeDog mk3 edges closer

The digital finder is now working very nicely on my 18” Dobsonian. Celestron have a range of telescopes that use the technology to control their scopes, without the need for initial user alignments on reference stars. Would this work on my 18”?

A few quick experiments showed the concept is very viable. Without any initial alignment of my Dobsonian, I can push or drive the scope to a position, and within seconds the digital finder has worked out where it is pointing and sent the result to the ScopeDog drive to start tracking. This was done with some quick and dirty hack code. A proper solution is more difficult.

Currently ScopeDog uses Java, but the plate-solving is most easily done with Python. Converting the Java to Python is proving fairly straightforward, and I am taking the opportunity to review some routines and make better use of 'off the shelf' astronomy code (PyEphem). Debugging is so much easier in Python too!

For now I am leaving the eFinder in its own box with a dedicated Raspberry Pi. Eventually the eFinder code would go into the ScopeDog box - I've checked the Pi in there can cope with the extra workload. No Nexus DSC is needed (no encoders). Without a Nexus DSC, my ScopeDog needs its own GPS dongle and a direct connection to SkySafari running on a tablet or PC. These changes were easy, although talking via code to SkySafari was a learning experience!
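
For anyone curious, SkySafari speaks the Meade LX200 command set over TCP — mostly a stream of `:GR#`/`:GD#` requests wanting RA and Dec back promptly as strings like `HH:MM:SS#` and `sDD*MM'SS#` (the exact punctuation varies with the precision setting). A sketch of just the reply formatting, not the full socket server:

```python
def format_ra(ra_hours):
    """RA in decimal hours -> LX200-style 'HH:MM:SS#' reply string."""
    total = round(ra_hours * 3600) % (24 * 3600)
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}#"

def format_dec(dec_degrees):
    """Dec in decimal degrees -> LX200-style "sDD*MM'SS#" reply string."""
    sign = "+" if dec_degrees >= 0 else "-"
    total = round(abs(dec_degrees) * 3600)
    d, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return f"{sign}{d:02d}*{m:02d}'{s:02d}#"
```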

Anyone building their own ScopeDog (instructions start here) will be relieved to hear that the mk3 will be a firmware upgrade to a mk2. Realistically I need a year to fully complete and test the mk3, but if anyone wants to get involved early, contact me. (email on the first ScopeDog instructions page)

© AstroKeith 2022