Complete page on PyMovie at the IOTA website
* Open PyMovie
* In the center box of options, click the 'file/folder' tab. Beneath it you'll now see choices. Click 'open AVI...'
* Navigate to your AVI file and click on its name
* In the center box just below, click 'select folder' and select a pre-existing folder, within which PyMovie will create a new working folder for everything it produces. You must do this first, before any analysis or other actions. For me, I've just been letting it create its working folder on my flash drive at d:\occvideo, and it uses the same name as my video.
* PyMovie starts out with the 'view avi fields' box in the upper left checked by default. That means it will show both fields of a frame, one on top of the other (interlaced video stores a frame's two fields on alternating rows; see the sketch after this list). I assume you've already trained PyMovie to read your time stamps from the VTI. If not, follow the link at the top of my page at the IOTA website.
* If you have already set up your OCR'ing of the VTI numbers, just un-check the 'view avi fields' box and it'll then show the regular-looking video frame
* Use the arrow positioning controls below the video image and move through your video to where you want to start your data analysis of the recording. I find I have to advance a few frames before data shows up.
* Click "Mark" and that bookmarks that moment on the tape, so you can return to it later.
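If the fields-vs-frames business is unfamiliar: interlaced analog video stores the two fields of each frame on alternating rows, recorded a half-frame-time apart, which is why they matter for timing. PyMovie handles all of this for you, but here is a minimal Python sketch of the idea, using the common OpenCV and NumPy libraries. The file name and frame index are made-up examples, and this is only an illustration of the concept, not PyMovie's code.

```python
# Conceptual sketch only -- not PyMovie code.
import cv2
import numpy as np

cap = cv2.VideoCapture("my_event.avi")       # hypothetical file name
cap.set(cv2.CAP_PROP_POS_FRAMES, 150)        # jump to a frame of interest (hypothetical index)
ok, frame = cap.read()                       # frame comes back as an H x W x 3 BGR array
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    even_field = gray[0::2, :]               # rows 0, 2, 4, ... = one field
    odd_field = gray[1::2, :]                # rows 1, 3, 5, ... = the other field
    # With 'view avi fields' checked, you see both half-height fields, one on top of the other:
    stacked_view = np.vstack([even_field, odd_field])
    print(gray.shape, even_field.shape, stacked_view.shape)
```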
Making a Finder image
* If your target is very dim, you may need to create a 'finder', which is a set of frames stacked into a single image so you can place your apertures accurately (see the stacking sketch after this list), so...
* Click on the "finder" tab and, within it, click on "align: star"
* Redact: for my IOTA VTI3 as it's configured, I use top=0 and bottom=122, which trims off the VTI numerals; hiding them is necessary when creating a finder image
* "Num frames": I use 111 frames to stack, just because 111 is easy to type, it's do-able pretty quickly, and it's virtually always enough. It's not a critical number. Use what you want.
* Now right-click on a good star for tracking (not saturated if possible, but not too faint), click 'create snap-to aperture', and you must give it the name "stack"
* Click 'generate finder' and it'll ask you to confirm that you have redacted (hidden) the VTI numerals; say 'yes', or re-do the redaction if some of the numbers are still visible.
* Once you confirm, it should march forward and stack your ~111 frames
* In the upper left, check the 'Show image contrast control' box and pull down the purple slider on the right till the image looks bright enough for you to make your apertures (see the contrast sketch below). You should have a good gray sky with clearly visible stars.
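For the curious, here is a rough sketch of what 'generate finder' is doing conceptually: blank out the redacted timestamp rows, locate the "stack" star in each frame, shift each frame so that star stays put, and average. The file name, frame range, redaction depth, and star position are hypothetical example values, and PyMovie's real alignment is more sophisticated than this whole-pixel version.

```python
# Conceptual sketch only -- not PyMovie's actual stacking code.
import cv2
import numpy as np

def star_position(img, x0, y0, box=16):
    """Rough 'snap-to' position: brightest pixel inside a small search box around (x0, y0)."""
    y1, y2 = max(0, y0 - box), y0 + box
    x1, x2 = max(0, x0 - box), x0 + box
    patch = img[y1:y2, x1:x2]
    dy, dx = np.unravel_index(np.argmax(patch), patch.shape)
    return x1 + dx, y1 + dy

cap = cv2.VideoCapture("my_event.avi")
cap.set(cv2.CAP_PROP_POS_FRAMES, 150)        # start of the stack (hypothetical frame index)
num_frames = 111                             # same spirit as the 111-frame stack above
redact_bottom = 122                          # number of bottom rows to blank (example value)
ref_x, ref_y = 320, 200                      # rough position of the "stack" star (hypothetical)

accum, count = None, 0
for _ in range(num_frames):
    ok, frame = cap.read()
    if not ok:
        break
    img = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
    img[-redact_bottom:, :] = 0              # redact: blank the VTI timestamp rows
    sx, sy = star_position(img, ref_x, ref_y)
    shifted = np.roll(img, (ref_y - sy, ref_x - sx), axis=(0, 1))  # whole-pixel alignment
    accum = shifted if accum is None else accum + shifted
    count += 1
cap.release()

finder = accum / count                       # the averaged 'finder' image
```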
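And the contrast slider, conceptually: the stacked finder has much lower noise than a single frame, but it needs a display stretch before faint stars show up against a gray sky. A minimal sketch of such a linear stretch, with a synthetic stand-in for the finder image:

```python
import numpy as np

# Synthetic stand-in for the averaged finder image (use the result of the stacking sketch above).
finder = np.random.default_rng(0).poisson(lam=25, size=(480, 640)).astype(np.float64)

def stretch_for_display(img, upper):
    """Linearly map [0, upper] onto [0, 255]; anything above 'upper' saturates to white."""
    return (np.clip(img / float(upper), 0.0, 1.0) * 255.0).astype(np.uint8)

# Pulling 'upper' down plays the role of pulling the purple slider down:
# the sky background brightens toward gray and faint stars become visible.
display = stretch_for_display(finder, upper=60)
```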
Setting your Apertures, whether on a 'finder' or else on a convenient early frame image...
* First right-click on blank sky and make an aperture and call it something obvious. I call mine "sky", others use "nostar", but it doesn't matter.
* Next right-click on your tracking star and in the dialog box click 'add snap-to aperture'. Call it 'track' and color it yellow from the dialog box
* Next right-click on your target star and call it 'target'. If your target star is bright enough for PyMovie to follow well throughout the event, make it a 'snap-to-blob' aperture. If it's faint, make it a 'static' aperture. I use 'static' apertures for most of my targets, as they're typically faint; that's as recommended by the programmer, Bob Anderson. However, if you try to make a 'static' aperture on a "finder" field, PyMovie has a bug and will not obey: it makes a snap-to-blob aperture instead, which only becomes 'static' when the star is too faint to follow reliably. Until that bug is fixed, if you want a truly static aperture you won't be able to use a 'finder'. One work-around is to use a finder, find the relative coordinates of your target star on that 'finder', and write them down. Then, without using the 'finder', go to a single frame at the beginning of the 'finder' stack and position the target aperture at those coordinates. (The sketch after this list illustrates the difference between 'snap-to' and 'static' placement.)
* If there's wind, so that the tracking and target stars show the same blurring, you'll want the mask created for the tracking star to be mirrored for the target star. Also, if you think your target is going to be too hard to follow, or there's a contaminating star and you don't want to 'snap to' for fear it'll snap wrong, then go to the "Misc." tab and check 'yellow mask=default'. That will make the frame-to-frame aperture masks identical for the tracking and target stars: same shape, size, and number of pixels.
* You must perfectly center this 'static' aperture on your target star, so right-click on your target star, choose 'enable jogging', and use the keyboard arrow keys to position the "X" so it's centered on your target star; then right-click and 'disable jogging'.
* Now right-click on your tracking star again and, in the menu, choose 'turn green to connect to spinner'. Then, in the upper-left controls area, adjust "set mask (mskth)..." to about 10, and try going up or down a bit to see what captures all of the star without wandering too far out into the noise, by watching the yellow aperture box in the lower-right corner change as you change mskth (see the mask-threshold sketch after this list).
* When that's done, be sure to again right-click on that aperture and 'turn yellow (=tracking star)'.
* Now right-click on your target star and look at the yellow aperture box to see if that mskth number is fine for your target star. But if you've already checked "yellow mask=default", so that the masks are enforced to be the same for target and tracking, then you just have to accept that number. For a tough event like Arecibo June 30 '21, that's the case.
* Check that your auto-created "sat. pixel value" is reasonable. It should be about 200, and no bigger than 255, the 8-bit limit. Pixels above that value will show as red in your video close-up as you analyze.
* Click 'analyze' and watch it go. Move your mouse over the chosen aperture to see it magnified below (see the photometry sketch after this list).
* Then click the 'plot' button underneath the main video image and it'll make several plots you can view.
* Then don't forget to click 'save to CSV' to save your photometry file. I save it to my Lexar flashdrive.
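Since the snap-to vs. static distinction above trips people up, here's a small sketch of the two placement ideas. This is my own illustration, not PyMovie's code, and the positions and box size are made-up examples: a snap-to aperture re-centers itself on the star's centroid every frame, while a static aperture keeps the coordinates you jogged it to (in PyMovie, as I understand it, it's carried along by the yellow tracking aperture rather than re-centering on its own star).

```python
import numpy as np

def snap_to(img, x0, y0, box=8):
    """'Snap-to' placement: move to the flux-weighted centroid inside a small box around (x0, y0)."""
    y1, y2 = max(0, y0 - box), y0 + box + 1
    x1, x2 = max(0, x0 - box), x0 + box + 1
    patch = img[y1:y2, x1:x2].astype(np.float64)
    patch = np.clip(patch - np.median(patch), 0, None)     # remove the local sky level
    total = patch.sum()
    if total == 0:
        return x0, y0                                       # nothing bright enough to snap to
    ys, xs = np.mgrid[y1:y2, x1:x2]
    return int(round((xs * patch).sum() / total)), int(round((ys * patch).sum() / total))

def static(img, x0, y0):
    """'Static' placement: ignore the image entirely and keep the coordinates you set by jogging."""
    return x0, y0
```

A target barely above the noise will pull a centroid toward whatever noise spike happens to be nearby, which is why a static aperture, carefully jogged onto the target, is usually the safer choice for dim stars.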
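Here's a sketch of the mask-threshold idea behind 'set mask (mskth)...': pixels more than mskth counts above the local background form the star's mask, and only those pixels get summed for the photometry. The star, image, and numbers below are synthetic examples, not PyMovie internals.

```python
import numpy as np

# Synthetic frame: noisy sky plus one fake star at (x=320, y=200).
rng = np.random.default_rng(1)
img = rng.poisson(lam=20, size=(480, 640)).astype(np.float64)
yy, xx = np.mgrid[0:480, 0:640]
img += 300.0 * np.exp(-((xx - 320) ** 2 + (yy - 200) ** 2) / (2 * 2.5 ** 2))

def build_mask(img, x0, y0, mskth, box=8):
    """Boolean mask of pixels around (x0, y0) that sit more than mskth counts above the background."""
    y1, y2 = max(0, y0 - box), y0 + box + 1
    x1, x2 = max(0, x0 - box), x0 + box + 1
    patch = img[y1:y2, x1:x2]
    background = np.median(patch)                  # crude local sky estimate
    return patch > background + mskth

mask_track = build_mask(img, 320, 200, mskth=10)
print(mask_track.sum(), "pixels in the mask")      # raising mskth shrinks this; lowering it lets noise in

# 'yellow mask=default' amounts to reusing the tracking star's mask, unchanged, for the target:
mask_target = mask_track.copy()
```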
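Finally, a sketch of what the saturation check and the CSV boil down to: pixels at or above the 'sat. pixel value' get flagged (those are the ones drawn in red), the masked star pixels are summed with the sky level subtracted to give one brightness number per frame, and those numbers go into the CSV that PyOTE will read. This is the generic aperture-photometry idea with made-up column names and dummy values, not PyMovie's exact output format.

```python
import csv
import numpy as np

SAT_PIXEL_VALUE = 200        # 8-bit data tops out at 255, so values near it are suspect

def aperture_signal(patch, mask):
    """Sum of (pixel - sky) over the mask; sky is estimated from the unmasked pixels in the same patch."""
    patch = patch.astype(np.float64)
    sky = np.median(patch[~mask])
    saturated = int((patch[mask] >= SAT_PIXEL_VALUE).sum())   # these would be drawn in red
    return float((patch[mask] - sky).sum()), saturated

# During 'analyze', one such number per aperture is accumulated for every frame.
# Dummy rows, just to show the shape of the output:
rows = [(0, 152.3, 0), (1, 148.9, 0), (2, 12.1, 0)]
with open("photometry.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "target_signal", "saturated_pixels"])
    writer.writerows(rows)
```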
You're now ready to run PyOTE.