Testing Optical Setups on 10" Meade
Optional - you don't need VirtualDub any more. But if you still want it, and you don't have VirtualDub installed yet...
* Go to SourceForge to download the 32-bit version of VirtualDub here. It downloads a .zip file; choose "save file" to your desktop.
* In CamUnZip, it should already want to extract the files to the VirtualDub directory created on your desktop. Click Extract in CamUnZip.
* From here, you may want to watch this YouTube video on installing and setting up VirtualDub
* You should now see a folder on your desktop called "VirtualDub...."
* To install the program, click Start, then Computer, then C:, then Program Files (x86). Right-click the VirtualDub folder on your desktop and drag it into the empty space at the far right of the now-visible folders of all the x86 programs. If you don't drag far enough into the empty space, it'll try to put your VirtualDub folder inside an existing program folder - don't do that! Drag well to the right into the empty space and release, and a menu appears with several options, "Copy here" being the first.
* Click Copy here and you should see the folder appear in the x86 directory just like all the other x86 program folders.
* Click on the folder to see its contents. Right-click the gear-icon application near the bottom labelled VirtualDub and, in the menu, choose Create shortcut. It should offer to put the shortcut on your desktop instead, which is what I do.
* You've now got VirtualDub installed!
You should already have Live Moviemaker as part of your Win7 package. To make a shortcut on your desktop, do this...
* Click Windows Start, then All Programs, go to the bottom of the list and click the Windows Live folder, and you should see Live Movie Maker.
* Right-click Live Movie Maker, then Send to, then Desktop (create shortcut).
* You can now start to drag and group your astro-video-related desktop icons into your favorite corner of your desktop!
Importing your Video from your Camcorder
* Connect the FireWire cable from your computer to the ZR45mc or other ZR camcorder; the port is under the gray flap at the bottom of the lens end of the camcorder.
* Turn on the camcorder in the 'down' (play) position.
* If this is the first time, you'll see the computer try to find the drivers; it should not require your help. This might take 5-15 seconds or so.
* You'll now see a prompt appear on your taskbar which looks like this
* Enter a name for the .avi file which it will create, and select Choose parts of the video to import
* Click Next and it'll put the camcorder into the pause position. You should see the camcorder control screen appear on your computer with an image of whatever is on your tape.
* Click the back/fwd buttons until the tape is at the position you want to begin importing. Especially for faint events, give it a good long time before and after the event, at least a minute, so it can get a good baseline brightness and variation. When you're there, click import
* Watch as it imports and especially look for more than 0 dropped frames; you'll generate about 120+ MB per minute. If you get dropped frames (I haven't so far), try unburdening your computer of other tasks and try again. Make sure you're capturing to your primary HDD, not going through USB to an external HDD - that might slow it down and cause dropped frames.
* Click stop when you want to end it, and then finish.
* Your .avi file will appear automatically in the Videos folder of your computer. To find it, click Start at the extreme lower left, then Computer, and you should see the Videos folder under Libraries. You can move it wherever you want, e.g. to a folder called rawvids.
Using LiMovie to Get Photometric Data from the .avi File
David Herald's 2006 instructions.doc for LiMovie (notes from Tony, Brad...)
- Image rotation is not a problem for these ~2 minute periods; at ~30 minutes it might be. The exception is a star almost perfectly overhead, in which case the field may indeed rotate significantly.
- For asteroid events, click 'standard' in the "Form of BKG-Area" box. That gives you three circles: the inner circle collects light from the star, the middle annulus is not counted, and the annulus between the 2nd and outermost circles collects the sky background brightness.
- Use 'drift' with a radius of ~4 pixels if stable, or more if windy. Threshold 95, unless the aperture jumps around too much; then try lower.
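The three-circle scheme above is standard aperture photometry: sum the counts inside the star circle and subtract the per-pixel sky level estimated in the outer annulus. Here's a minimal sketch on a synthetic frame (the function and parameter names are mine, chosen only to echo LiMovie's radius/inner/outer settings):

```python
import numpy as np

def aperture_photometry(image, cx, cy, r_star=5, r_inner=7, r_outer=25):
    """Sum the star's light inside r_star and subtract the sky level
    estimated in the annulus r_inner..r_outer (the gap between r_star
    and r_inner is not counted, matching LiMovie's middle circle)."""
    yy, xx = np.indices(image.shape)
    dist = np.hypot(xx - cx, yy - cy)
    star_pix = image[dist <= r_star]
    sky_pix = image[(dist > r_inner) & (dist <= r_outer)]
    sky_level = np.median(sky_pix)  # median is robust against stray stars
    return star_pix.sum() - sky_level * star_pix.size

# Synthetic frame: flat sky of 10 counts plus a 500-count "star" at (20, 20)
frame = np.full((50, 50), 10.0)
frame[20, 20] += 500.0
print(aperture_photometry(frame, 20, 20))  # 500.0 (sky removed)
```

This is why the sky annulus must be free of stars and bad pixels: anything bright in it inflates the sky estimate and artificially dims the target.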
* Fire up LiMovie (download it here). It's a zip file; if you unzip it in your Program Files (x86) folder, Winblows will probably 'block' it. If so, click 'Properties' and then 'Unblock'. You may instead have to unzip it somewhere outside Program Files (x86), unblock it there, and then move it into Program Files (x86) if you really want it there. Or you can just use a separate folder like I do, called MyPrograms.
* Make sure you have version 0.9.98a2 or later, or you can't seem to do linked tracking unless you use PSF mode.
If You've Already Installed LiMovie, Start Here
* At upper left, click File | AVI File Open.
* Here is the LiMovie screen.
* At the far right, change the Video Time Inserter to IOTA if you have an IOTA VTI, as most of us now do.
* In the Form of BKG box, click standard for asteroid events. This gives you circular apertures for the photometry measurements.
* Gamma Reverse Correction should be off if you have your gamma set to 1 on the video camera (as I do on my Watec 910hx).
* In the PSF box, perhaps uncheck the tracking and photometry options. Pundits find the tracking isn't as good using the PSF option. That is NOT my personal experience; for me it's the opposite.
* Look at the bottom right corner. Object A is the target star+asteroid. Right-click somewhere on the sky and, in the box that comes up, select Object Add. Now left-click carefully on the actual target star to center it. You should see a tiny red circle and two larger blue circles - the color scheme for the target. A separate box called Operation Guide will pop up, where you can verify the PSF and whether each object is a target, tracking, or comparison star.
* If your target is very close to a brighter object, you'll likely have trouble, as the software will want to 'snap' to the brighter object. To prevent this, make sure the "Star Tracking" box is set to 'off', then go to your target star+asteroid, right-click, and in the pop-up menu select 'set position only' at the bottom; then very carefully put the tip of the mouse arrow dead-on the target star. Be very careful: it won't 'snap' to the center of the star, you have to put it there.
* Along the bottom, see radius, inner, and outer. I set radius to maybe 5, inner to 7, and outer to 25, if nearby stars permit. (Radius is the radius within which the target light is measured; the annulus from inner to outer is where the sky background is measured, and it must be free of bad pixels or stars.) If the target is very close to another bright object (like Umbriel next to Uranus!), then cut the circles to be sure not to include the other object. You can also use "avoid lunar limb" to get a half donut for the sky, or even a quarter donut if you go to the top menu bar 'options' / 'measurement options' and select cutting the top or bottom half off. You can then rotate the aperture with the 'direction setting' inside the 'star tracking' box so as to minimize contamination from your nearby bright source.
* You need to make sure your stars are not saturated, especially the target (A) and comparison (C); it's perhaps a little less important for the tracking star (B). The easiest way to check saturation is to click an aperture set onto a star and then, in the lower right, click on Star Image [3D]. If it's flat-topped, it's saturated near the center.
* Now right-click somewhere on the sky, again Object Add, and center that on your tracking star. It'll be yellow circles. This is the tracking star, used in case your target star gets so dim that the software might lose track of it. If the target never disappears, then don't do 'linked' tracking but instead let it 'drift' on the target; that should be more accurate than linked tracking with the wobbling of the tracking star.
* Now right-click somewhere on the sky, again Object Add, and center on your comparison star. It'll be pink circles - the comparison star. You can make the comparison and tracking star the same.
* In the bottom right, click on star A (target). In the Star Tracking box click OFF.
* In the lower bottom Current Object box, click on B.
* In the Star Tracking box click drift. Be sure to re-check this right before you're done, because LiMovie has the annoying habit of resetting it to anchor when you're not looking!
* In the Linked Tracking box click Link. You've now said the target star/asteroid tracking will be linked to the tracking star (Star B). Only use Link if the target is expected to completely (or momentarily completely) disappear, so that the circle may lose it on 'drift'. If the target is always track-able, then don't click 'link'; leave target A on 'drift'.
* In the lower bottom Current Object click on C
* In the Star Tracking box click drift.
* Go through and click the A, B, C boxes one last time and make sure the parameters are what you set - again, LiMovie seems to like changing B back to 'anchor' even though you set it to 'drift' . Also make sure that the VTI is set to IOTA as it has the tendency to revert back to KIWI when you're not looking! You'll know this error when you see no times on your csv file.
* Position the video to the beginning, and click Start and watch the data lines get generated. Make sure the data times are also recording in the stream.
* When finished, the first thing to do is click GRAPH at the lower left and look at the light curve. If it looks funny, click DataRemove and start over.
* If the light curve looks good, then click SaveToCSV-File and save the file. This is the file that Occular or other software will need to analyze for the timings.
* Before leaving LiMovie, zoom in on the light curve at the beginning and determine the number of points in an integration. If you integrate N frames into a single point (i.e. you set the Watec 910hx to mode 2Nx fields), then you'll see the light curve in chunks of N frames of nearly but not exactly equal brightness.
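If you'd rather count the chunk size programmatically, the logic is just run-length counting of near-equal brightness values. A sketch on a made-up light curve (LiMovie doesn't do this for you; this only illustrates the 'chunks of N frames' idea):

```python
def block_size(brightness, tol=1e-6):
    """Return the most common run length of consecutive near-equal values,
    i.e. the number of frames per integration block."""
    runs, run = [], 1
    for prev, cur in zip(brightness, brightness[1:]):
        if abs(cur - prev) <= tol:
            run += 1
        else:
            runs.append(run)
            run = 1
    runs.append(run)
    return max(set(runs), key=runs.count)

# 8x on the Watec = 4-frame integration: each level repeats 4 times
curve = [100, 100, 100, 100, 92, 92, 92, 92, 105, 105, 105, 105]
print(block_size(curve))  # 4
```

With real data the points inside a block are only nearly equal, so you'd raise tol to a few brightness counts.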
Run PyOTE to analyze your CSV file to get best fit Timings (this has become the standard now)
* Click on your desktop PyOTE shortcut and wait till Python brings up the PyOTE window.
* Open your .csv file.
* First let it find the integration setting automatically by clicking 'block integrate', and 'accept' if it does it right. If not, then just zoom in, take what YOU think is a block, mark its beginning and end with the little red dots, then click 'block integrate' and accept if it finds the right values.
* Then select the D region by bordering it, generously if you can. Then the R region. Then click 'search for occultation'.
Run Occular V4.0 - Will Statistically Analyze Your .csv File To Find the Best fit to Your Timings
-- select file and get your file. It'll show the .csv files
-- Click on the gray box to the left of the header line with No. next to it, immediately above the first data line. This should highlight the entire top row; make sure the entire row is highlighted. This tells Occular the header titles.
-- Find the Object1 Result Measurement column and click on the first data entry to highlight it.
-- Use the vertical positioner to go all the way to the last data entry in this column and Shift-left-click on it, to highlight the entire column.
-- Click Analyze Data and it'll warn you that you need a time stamp. Click yes, then navigate to the row 1, column 1 entry (which is frame number 0.0) and double-click on it to bring up a box for you to work with.
-- Follow the format they suggest and enter the center-of-frame time, e.g. 09/10/2017 03:47:48.6003 (month/day/year hr:min:sec), in the format line (not the boxes) provided, then click Get Time From Entry, then OK. It will populate the individual boxes for date, hr, etc.
*** What if you don't have good time stamps on some of your video?
** Then you need to realize some things: (1) the "center of frame" time, which is what you want to see in the LiMovie .csv file, is, on the raw video, the second of the two field times shown by the IOTA-VTI. Go to somewhere in your .csv file where you know there is an accurate time stamp on the raw video. Note on the raw video the hr, min, sec, and the fraction of a second in the SECOND (i.e. right-most) field time (you'll see this is 1/60 sec earlier than the FIRST (left) time stamp on the raw video for EIA (NTSC) video cameras).
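Whichever of the two field stamps you read, converting between them is just a 1/60-second shift for EIA (NTSC) video. A small arithmetic sketch (the helper name and the sample time are mine, not LiMovie's):

```python
FIELD_SPACING = 1.0 / 60.0  # seconds between the two interlaced fields (NTSC)

def shifted_field_time(t_seconds, earlier=True):
    """Shift a field timestamp (expressed as seconds of day) by one field interval."""
    return t_seconds - FIELD_SPACING if earlier else t_seconds + FIELD_SPACING

# e.g. 03:47:48.6003 expressed as seconds of day
t = 3 * 3600 + 47 * 60 + 48.6003
print(round(shifted_field_time(t), 4))  # 13668.5836
```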
-- Click Analyze Data. You may find it fails the data integrity check: Occular verifies that your captured VTI times are spaced .033 or .034 sec apart, and if any frame spacings differ from this, it'll fail the check. For me, LiMovie sometimes misreads my Kiwi time stamps, confusing "6", "8" and "9". It'll flag them, and it's pretty easy to see what the digits should have been read as, from the records on either side. Click Write 'error marked' CSV File and use a different name so you don't lose the original. This 'error marked' file will have a new column 1 containing "red" in any row you modified; this added column prevents using the same file for reduction. So open Excel, open both the original file and the error-marked file, use the error-marked file to locate the records needing fixes in the original, then fix the original and save it. I don't see errors from the IOTA-VTI; it has never given me a fail on the integrity check in the many years I've used it.
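The integrity check Occular performs can be mimicked in a few lines: NTSC frames are spaced 1/29.97 ≈ 0.0334 s apart, so spacings of 0.033 or 0.034 s pass and anything else (e.g. a misread VTI digit) is flagged. A sketch with a hypothetical timestamp list:

```python
def flag_bad_spacings(times_s, ok=(0.033, 0.034), tol=5e-4):
    """Return indices of timestamps whose spacing from the previous one
    is not ~0.033/0.034 s (e.g. a digit LiMovie misread from the VTI)."""
    bad = []
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        if not any(abs(dt - good) <= tol for good in ok):
            bad.append(i)
    return bad

# One bogus jump of 0.333 s planted at index 4
times = [0.000, 0.033, 0.067, 0.100, 0.433, 0.467]
print(flag_bad_spacings(times))  # [4]
```

As in Occular, a flagged row is usually obvious to repair by eye from the records on either side.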
-- Again click Analyze Data. It should bring up the analysis panel, which includes your light curve and parameter boxes.
If you used a camera integration setting of 2x (= frame integration of 1, i.e. no integration), then skip down to the "------------" line. If you integrated above 2x, keep reading.
-- If you used integration of frames, then on the screen showing the table of data in csv form, check the enable running/block averages option, then check block average, and set num pts = 4 (for an 8x setting on the Watec).
-- For offset, look carefully at the data and find a point where the scintillation made for a large change in brightness, so it's obvious how the points are grouped by your integration. Take the first frame number of that group, subtract the number of the first frame in the LiMovie .csv file, and divide the result by N, where N is your frame integration number (a Watec mode setting of 8x (8 fields) means frame integration N=4 frames, since there are 2 interlaced fields per frame, etc.). So: (frame# at the beginning of an integration - frame# of the first record in the LiMovie .csv) / N = integer + remainder. The "remainder" is the "offset". For example, the "D" in the Eukrate15 video (which starts at frame #0) showed a partial drop with 4 frames all about the same, beginning with frame 1705. 1705/4 has a remainder of 1; that's the offset. It'll be a number between 0 and 3 in this case. Another example - Katja on Dec 31, 2017: the first frame number in the .csv file is 981, and a block of 4 points (since it was an 8x = 4 frame integration time) begins at frame #988. So 988-981=7, and 7/4 has a remainder of 3, so offset=3. Now use that offset and see if the group of frames that is one integration is faithfully reproduced when you apply the block average and offset to the data. If it fails to reproduce, try other offsets and see what gives a faithful average for each block of integrated frames. You can click-drag a box around the part of the light curve you want displayed at higher magnification, and it'll let you left-click on a point to get the frame # and the brightness value.
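The offset arithmetic above is just a remainder (modulo) operation. A sketch using the two worked examples from the text:

```python
def block_offset(first_csv_frame, block_start_frame, n):
    """Offset of the integration blocks relative to the first .csv record:
    the remainder of (block start - first record) divided by N."""
    return (block_start_frame - first_csv_frame) % n

# Eukrate15: .csv starts at frame 0, a block starts at frame 1705, N = 4
print(block_offset(0, 1705, 4))   # 1
# Katja, Dec 31 2017: .csv starts at 981, a block starts at 988, N = 4
print(block_offset(981, 988, 4))  # 3
```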
-- Now that the parameters are set, you can let Occular search through a range of occultation time lengths to find a range which includes an obvious occultation. Here's how to set it...
-- Note that the light curve is plotted with frame number along the bottom, even when block averaged.
---------------------------------------------------------------------------------------------------------------------------------
-- For Event Bottom it wants the duration of the occultation. Look at the light curve; for minimum and maximum, enter the shortest and longest durations it could be, in frames. BUT if you have block averaged, say with 4 frames per block (e.g. 8x set on the Watec), then you must divide the min and max by 4. If you don't, it'll probably not find the occultation and Calculate MaxFOM will remain grayed out. If you've block averaged, there will be fewer points on your light curve and the analysis will go quickly; if not, it may sit there for half a minute before it finds the best solution.
-- For Wing Length choose a number of frames as long as possible but less than the un-occulted time length before the D and after the R, but divide first by the integration (e.g. 4).
-- Click on Mag Data Input and it'll pop up a box. Fill in the available magnitudes of the star, using Steve Preston's "detailed information" link on the event, or the data on the US map.
-- click write to analysis panel and it'll send the magnitudes to the analysis panel
-- Click Calculate MaxFOM to calculate the best figure of merit from the possible occultation parameters you've entered. It may take 10-30 seconds to find the best timings. Click on Show Noise Histogram and you'll see the distribution of errors to the model; it should look single-peaked and roughly Gaussian. Do PrtScr and Ctrl+V to paste into Photoshop for the graphs you want.
-- Click Final Report to generate 100 solutions for the best D,R timings and accuracies. It'll take up to a few minutes to do that.
-- On the final report panel, click Write Panel and it'll create a .png image which you can play with in Photoshop for display on your web pages.
Reporting Observations to IOTA
* Use OccultWatcher to populate the spaces in the report form with the right names/dates etc. However, YOU must put in the timings, of course.
* And YOU must correct your times from those reported on OCCULAR's "final report" by the correction determined by Tony George here. Here is the relevant table for both the Watec 910hx and the PC165DNR; both use the same table. For Occular there is no VTI time correction (=0). The table below for the Watec basically says the true time was one full integration time before the Occular-determined time, e.g. camera mode = 32x fields = 16 frames = ~1/2 second of real time.
Videocam setting as shown on LCD screen of camcorder | Camera Delay time corr (sec) |
no integ | -0.0167s |
2x (fields) | -0.0334s |
4x | -0.0667s |
8x | -0.1335s |
16x | -0.2669s |
32x | -0.5339s |
64x | -1.0677s |
128x | -2.1355s |
256x | -4.2376s |
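Applying the Watec table is a simple lookup; note the corrections are negative, i.e. the true time is earlier than the Occular-reported time by roughly one full integration. A sketch (the dictionary values are copied from the table above; the function name is mine):

```python
# Watec 910hx camera-delay corrections, in seconds, from the table above
WATEC_DELAY = {
    "no integ": -0.0167, "2x": -0.0334, "4x": -0.0667, "8x": -0.1335,
    "16x": -0.2669, "32x": -0.5339, "64x": -1.0677,
    "128x": -2.1355, "256x": -4.2376,
}

def correct_time(occular_time_s, mode):
    """Apply the (negative) camera delay correction to an Occular-reported time."""
    return occular_time_s + WATEC_DELAY[mode]

# A "D" reported at 48.6003 s with the camera set to 32x:
print(round(correct_time(48.6003, "32x"), 4))  # 48.0664
```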
And for the PC165DNR, here are the values:
Videocam setting | Camera Delay Time corr(sec) |
no integ | |
2x | |
4x | |
8x | |
16x | |
32x | |
64x | |
128x | |
256x |
Using OW to Prefill your Report form
* Fire up OW.
* Right-click on your asteroid event and select Report Observation.
* Prefill Report may be grayed out. No worries - click Report Observation and it'll ask if you want to make your official report; answer yes.
* in the information box that shows up, fill it in.
* If it's a <new configuration>, don't worry about naming it yet.
* When you're done, it'll ask you for a configuration name.
* Then it will make out the Excel report form and ask you to save it somewhere.
* Then Excel will start up with your pre-filled form there. Complete it, save it, and then send it to Brad Timerson, attaching the .csv file of the light curve.
Brad's page here.