So I’m writing up the discussion section for one of my short lab write-ups, and it asks for another way to calibrate the ocular micrometer besides using a stage micrometer or other type of ruler. My idea is to take an already prepared slide with cells of known size and use them to calibrate the ocular micrometer with the equation:
apparent size of object = magnification × actual size of object
From that we could then determine the distance each ocular unit represents at each magnification setting: if a cell of known size spans some number of ocular divisions, dividing the known size by that count gives the distance per division. Does this make sense? Does anyone have a better approach? I can’t come up with anything else.
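Just to sketch the arithmetic I have in mind (the cell size and division counts below are made-up illustration values, not real measurements):

```python
def microns_per_ocular_unit(actual_size_um: float, ocular_units_spanned: float) -> float:
    """Distance represented by one ocular division at a given magnification.

    If a cell of known actual size spans N ocular divisions, then one
    division corresponds to actual_size / N microns.
    """
    return actual_size_um / ocular_units_spanned


# Hypothetical example: a cell known to be ~7.5 um across (roughly a
# human red blood cell) appears to span 3 ocular divisions at 400x.
cal_400x = microns_per_ocular_unit(7.5, 3)
print(f"1 ocular unit = {cal_400x:.2f} um at 400x")  # 2.50 um per division
```

You'd repeat this at each magnification, since the value per ocular division changes as the objective changes; the magnification equation above is implicit in why the division count grows with magnification.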