What do you do if you need to display a time in the web browser that is synchronized to the server time, or, better yet, when you need multiple browsers to be synchronized to the same time? The answer, of course, is that you need to synchronize them to a single reference time, and the obvious choice is the server's clock. While it's not actually possible to change the time on the connected clients, what you can do is calculate each client's offset from the server time and then adjust the client time accordingly.
I find that making an ajax call to retrieve the time from a PHP function works just fine, but there are other ways to get a server time value. In fact, you can get one just by making an ajax request and reading the time from the response headers, so you don't even need to run any code on the web server itself.
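As a rough sketch of the header approach: every HTTP response carries a Date header stamped by the server, so a lightweight HEAD request to any same-origin URL is enough. The URL argument here is a placeholder, and `fetchServerTime` is just an illustrative name, not part of any particular library.

```javascript
// Parse the HTTP Date header (RFC 1123 format,
// e.g. "Tue, 15 Nov 1994 08:12:31 GMT") into a millisecond timestamp.
function parseServerDate(dateHeader) {
  return new Date(dateHeader).getTime();
}

// Ask the server for its time by reading the Date header of a HEAD
// response. Pass any same-origin URL; cache is disabled so the header
// reflects the current server clock, not a cached response.
function fetchServerTime(url) {
  return fetch(url, { method: "HEAD", cache: "no-store" })
    .then((response) => parseServerDate(response.headers.get("Date")));
}
```

One caveat with this shortcut: the Date header only has one-second resolution, so if you need sub-second accuracy you're back to returning a timestamp from your own server-side code.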
Once you retrieve the server time, you simply subtract one time from the other to calculate an offset. It doesn't really matter which way you do this calculation (server time minus browser time, or vice versa), but you have to be consistent: if you calculate your offset by subtracting the browser time from the server time, you have to add that offset back to the browser time to get the server time. This is the part where I find drawing a simple diagram helps me keep things straight!
I mentioned before that microseconds were a bit more precision than necessary. Let's face it, if microseconds are important in what you're programming, you're probably not doing web development; more likely you're building some kind of real-time control system, in which case none of this is relevant to you, so please just stop reading.

For the rest of you who are doing web development, you are certainly aware that things on the internet take time to get from the server to the browser. Whether due to server load, network latency, or other factors, it takes time for the server to respond to a request and for that response to reach the browser. So what we really receive from the server is a value that tells us what time it was a very short while ago. Depending on how accurate you need your time to be, these few milliseconds may not matter to you, but in case they do, the time has to be adjusted to take the response time into account. The easiest method is to subtract the time at the beginning of the request from the time at the end of the request and divide by two, which gives an estimate of the one-way delay, but you can go further and measure several request times. There's no need to take an average: the sample with the shortest response time will give you the most accurate result.
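The sampling idea above can be sketched like this. It assumes you already have some async function that returns the raw server timestamp (such as an ajax call, passed in here as `getServerTime`); the function name and default sample count are my own choices, not from any particular library.

```javascript
// Estimate the clock offset (server minus browser), corrected for
// network latency. Takes several samples and keeps only the one with
// the shortest round trip, since that sample has the least delay and
// therefore the most accurate timestamp.
async function sampleOffset(getServerTime, samples = 5) {
  let best = { roundTrip: Infinity, offset: 0 };
  for (let i = 0; i < samples; i++) {
    const start = Date.now();
    const serverTime = await getServerTime(); // raw timestamp from the server
    const roundTrip = Date.now() - start;
    if (roundTrip < best.roundTrip) {
      // Assume the server stamped the time halfway through the round
      // trip, so the matching local time is start + roundTrip / 2.
      best = { roundTrip, offset: serverTime - (start + roundTrip / 2) };
    }
  }
  return best.offset;
}
```

Note the choice at the end: the shortest round trip wins outright, rather than averaging all samples, because one slow request (say, a momentary network hiccup) would drag an average well away from the true offset.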
If all of this seems like a bit of a pain, I've got good news. Someone else has already done most of the work for you. It's still up to you to do something useful with the server time once you have it, but the ServerDate script by David Braun will get you a pretty accurate value for the time on the server.