Visual Basic 3 (or maybe 6? not sure) used twips (a device-independent unit, 1/1440 of an inch) as the default unit of measure, and it would always mess up my pixel-perfect layouts. I wrote a sort of calibration tool to figure out the number of twips per pixel on a system, but what an idiot I was! You could just go to the form (window) and set its unit of measurement to pixels. Well, I was 12 or something, so I have an excuse.
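For context, a twip is 1/20 of a point and a point is 1/72 of an inch, so the whole "calibration" reduces to a DPI conversion. A minimal sketch (the function name is mine, not VB's):

```python
# A twip is 1/20 of a point; a point is 1/72 inch, so 20 * 72 = 1440 twips per inch.
TWIPS_PER_INCH = 20 * 72  # = 1440

def twips_per_pixel(dpi: float) -> float:
    """Twips per device pixel at a given screen DPI."""
    return TWIPS_PER_INCH / dpi

print(twips_per_pixel(96))   # classic 96 DPI display -> 15.0
print(twips_per_pixel(120))  # "large fonts" 120 DPI -> 12.0
```

VB actually exposed this ratio directly as `Screen.TwipsPerPixelX` / `Screen.TwipsPerPixelY`, which is effectively what the calibration tool was recomputing by hand.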