While trying to figure out a simple way to convert solar radiation (W/m2) to average surface temperature, I’m having a particularly bad Google-day:
Me: Hey Google, how do I convert W/m2 solar radiation to average surface temperature?
Google: Want to buy tanning lotion? No? How about a SpectroRadiometer? No? Hey! I bet Wikipedia has the answer!
Wikipedia: Blah! Blah! Blah! Blah! And then Blah! Blah! Blah and…
Me: Arrgh! Show me your face and I will punch it!
Yeah! Yeah! Yeah! I know I’m punching above my mental weight class here, but I have questions that need answers!
Ok, so I went with the naive approach.
The solar constant for Earth (the W/m2 arriving at the top of the atmosphere) is 1366 (or 1361, or 1373… there are some other webpages that need to have their face punched), and the average surface temperature of the Earth is about 15°C… so dividing my W/m2 by 1366 (or whatever) and multiplying that by 15 would give me a rough estimate.
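Fine, fine… for future-me, and for anyone who actually wants the physics: the honest conversion isn’t linear. Temperature scales with the fourth root of irradiance (the Stefan-Boltzmann law). Here’s a minimal Python sketch, assuming an Earth-like albedo of 0.3 and a flat 33 K greenhouse fudge; both of those numbers are knobs to tweak per fictional world, not gospel:

```python
# Rough planet-temperature estimate from solar irradiance.
# The albedo and greenhouse offset are assumptions to tune per world.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def surface_temp_celsius(irradiance, albedo=0.3, greenhouse_k=33.0):
    """Estimate mean surface temperature (deg C) from top-of-atmosphere
    irradiance (W/m^2).

    Equilibrium temperature: T = (S * (1 - albedo) / (4 * sigma)) ** 0.25
    The factor of 4 spreads the intercepted sunlight over the whole sphere.
    greenhouse_k is a flat offset standing in for atmospheric warming
    (about 33 K for Earth); a crude fudge, not a climate model.
    """
    t_eq = (irradiance * (1 - albedo) / (4 * SIGMA)) ** 0.25
    return t_eq + greenhouse_k - 273.15

# Sanity check with Earth's solar constant:
print(surface_temp_celsius(1361))  # ~14.5 degC, close to Earth's ~15 degC
```

That fourth root is also why the naive linear shortcut drifts: doubling the sunlight only raises the absolute temperature by about 19%, so the shortcut gets worse the further a world is from Earth-like conditions.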
And before anyone starts throwing Wikipediaesque algebra at me… I’m building a fictional world for a book project, and if I ever started talking about watts per square meter, my book project would have a seizure and die. (As would the reader… or at least their desire to read…)
No puppies or babies will die if I get the temperature wrong… no puppies or babies even inhabit my fictional worlds… well, ok, maybe fictional babies and puppies, but no real babies or puppies!
Or wait! What the hell? I’m not writing about puppies and babies! Come on!