Here is how it is supposed to work:
The satellite transmits a time signal. From geosynchronous orbit (about 36,000 km up) the signal takes roughly 0.12 seconds to reach you; from a lower orbit, such as the GPS constellation's (about 20,000 km up), it takes roughly 0.07 seconds. If you can pick up only one satellite but you know where both you and the satellite are, then you know the distance, and therefore the delay, and you can work backwards to get accurate time. If you don't know both locations, you don't know the signal delay and cannot calculate the real time, though you can still guess to within about a tenth of a second.
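The "work backwards" step is just distance divided by the speed of light, added back onto the broadcast timestamp. A minimal sketch (function names and altitudes are illustrative, not from any real receiver API):

```python
# Hypothetical sketch: recovering accurate time from a single satellite
# whose position (and yours) is known.
C = 299_792_458.0  # speed of light, m/s

def one_way_delay(distance_m: float) -> float:
    """Signal travel time for a given line-of-sight distance."""
    return distance_m / C

def true_time(received_timestamp: float, distance_m: float) -> float:
    """The broadcast timestamp is stale by the travel time; add it back."""
    return received_timestamp + one_way_delay(distance_m)

# Geostationary altitude (~35,786 km straight overhead):
print(round(one_way_delay(35_786_000), 3))  # 0.119
# GPS orbit altitude (~20,200 km straight overhead):
print(round(one_way_delay(20_200_000), 3))  # 0.067
```

Note these are best-case (satellite directly overhead) figures; a satellite near the horizon is farther away, so the real delay is somewhat larger.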
Once you can see four satellites, you can solve for the delay from each one and trilaterate both your position and your receiver's clock offset (three satellites are enough if you already know your altitude). This gives you an accurate time fix.
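To make that concrete, here is a toy two-dimensional version of the solve, with made-up satellite positions: three measured pseudoranges pin down the two position coordinates plus the receiver clock bias (expressed in metres, i.e. speed of light times the clock offset), via a few Gauss-Newton iterations. This is a sketch of the principle, not a real GPS solver:

```python
import numpy as np

def solve_fix(sats, pseudoranges, iters=10):
    """Gauss-Newton on the residuals r_i = ||p - s_i|| + b - rho_i,
    for unknowns p = (x, y) and range-equivalent clock bias b."""
    est = np.zeros(3)  # (x, y, b), initial guess: origin, zero bias
    for _ in range(iters):
        d = np.linalg.norm(est[:2] - sats, axis=1)           # geometric ranges
        residuals = d + est[2] - pseudoranges
        J = np.column_stack([(est[:2] - sats) / d[:, None],  # d(range)/d(x,y)
                             np.ones(len(sats))])            # d(resid)/d(bias)
        step, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        est -= step
    return est

sats = np.array([[0.0, 20.0], [15.0, -10.0], [-15.0, -10.0]])
truth = np.array([3.0, 4.0])   # true receiver position
bias = 2.5                     # true clock bias (range-equivalent)
rho = np.linalg.norm(truth - sats, axis=1) + bias
x, y, b = solve_fix(sats, rho)
print(round(x, 3), round(y, 3), round(b, 3))  # 3.0 4.0 2.5
```

In 3-D you have four unknowns (x, y, z, and the bias), which is why four satellites are needed; the solved bias is exactly the "unknown delay" term, and subtracting it is what yields the accurate time fix.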
Your electronic device has its own clock. This clock will keep good time, but it will not be exact, and over time it will drift away from real time. Let's say that at noon your device's clock was synchronized to GPS time, and that 5 hours later, when it was synchronized again, it had drifted 10 "ticks" away from the real time. A well-implemented timing service would then realize that it needs to adjust its time by 1 "tick" every half hour to keep up with the GPS clock. This is known as a "GPS-disciplined" (or "GPS-trained") clock, and they can be extremely accurate. I have one in my lab that is stable to about 1 part in 10^14 (0.00000000000001).
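The training arithmetic above is simple rate estimation: observed drift divided by elapsed time, scaled to the adjustment interval. A minimal sketch (the function name is illustrative):

```python
# Sketch of the "training" arithmetic described above.
def ticks_per_interval(drift_ticks: float, elapsed_hours: float,
                       interval_hours: float) -> float:
    """How many ticks to steer the local clock by, each interval,
    so it tracks the GPS reference between synchronizations."""
    return drift_ticks * interval_hours / elapsed_hours

# 10 ticks of drift over 5 hours -> adjust 1 tick every half hour
print(ticks_per_interval(10, 5, 0.5))  # 1.0
```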
Most commercial devices use standard GPS modules that communicate with the host through a series of standard commands. For most modules these are NMEA 0183 sentences, and the time fields in those sentences carry a precision of 0.01 seconds. That is about as accurate as you should expect to get with your camera. Even the least accurate internal clocks out there hold roughly 1 part in 10^6, which is a drift of about 0.09 seconds per day. Being off by a full second is a big deal and indicates that something is wrong.
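You can see that 0.01-second ceiling directly in the sentence format. A hedged sketch of pulling the time out of an RMC sentence (the sentence here is constructed for illustration; the field layout `$GPRMC,hhmmss.ss,...` and the XOR checksum between `$` and `*` are per NMEA 0183):

```python
# Hedged sketch: extracting the time field from an NMEA 0183 RMC sentence.
def nmea_checksum(body: str) -> str:
    """XOR of all characters between '$' and '*', as two hex digits."""
    csum = 0
    for ch in body:
        csum ^= ord(ch)
    return f"{csum:02X}"

def parse_rmc_time(sentence: str):
    """Return (hour, minute, second) from a $GPRMC sentence, or None."""
    body, _, csum = sentence.lstrip("$").partition("*")
    if nmea_checksum(body) != csum.strip():
        return None  # corrupted sentence
    fields = body.split(",")
    t = fields[1]  # hhmmss.ss -- note: only 0.01 s of precision
    return int(t[0:2]), int(t[2:4]), float(t[4:])

body = "GPRMC,123519.00,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W"
sentence = f"${body}*{nmea_checksum(body)}"
print(parse_rmc_time(sentence))  # (12, 35, 19.0)
```

Whatever the module's internal timing does, two decimal digits of seconds is all that crosses this interface, which is why the camera can't honestly claim better.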