Sciencemadness Discussion Board

What is stronger? Sunlight or artificial light?

RawWork - 8-4-2018 at 05:55

I won't tell you why I am asking this, because that deserves another topic/thread.

I am asking this: what is stronger per unit area? Light from the sun in the best case (summer), or artificial light in the best case (from burning charcoal, a lamp, a torch, or an arc)?

One additional comparison question: what is stronger, sunlight or the light from lightning?

Of course I am not asking about theory, like what has more total energy, or what is bigger, or what would release more light out in space or somewhere else, but what will deliver more light onto us, onto me, onto some object I am working with.

I know that the sun releases about 1 kWh/m2.

I am thinking practically, for heating purposes. I think anything artificial is stronger, because it is a solid source up close. That's why we have to use face shields and other equipment, correct?

Sulaiman - 8-4-2018 at 06:21

There is no answer to your question because it is vague:
what artificial light source, at what distance, over what area?
e.g. stadium lights at close range would kill most plants, and a laser would burn through cellular life.

P.S. The sun delivers approximately 1.5 kW/m2 above the atmosphere and 1 kW/m2 at ground level.

NOT 1 kWh/m2 which is a measure of ENERGY per unit area.
ENERGY and POWER are NOT the same thing.

DavidJR - 8-4-2018 at 06:30

Sunlight, by far, is 'stronger' than the most frequently encountered artificial light sources. The human eye has an extremely nonlinear response to light (which is actually a good thing as it means we have a huge dynamic range) so it can be difficult to appreciate the difference in irradiance.

Try playing with a fully manual film camera (w/ light meter) and you'll see that in general you need a lot more exposure for bright indoor artificially lit scenes than for a sunlit scene on a sunny/slightly overcast day.

RawWork - 8-4-2018 at 06:30

Well, let's say I want to build a solar furnace to reach 4000 degrees Celsius. If I use a strong artificial light, I may be able to build it with much smaller lenses than the one built on our planet, which is too big? If I use the same lenses, the one which focuses light from a fire would give stronger heat at the opposite end. This excludes direct heat transfer from the fire, that is, the lenses are far enough from a normal fire that only light plays a role here.

I am saying this because I don't care about getting free energy, at least not in this case. I just want an extremely high temperature to melt, vaporize, react, reduce, etc. Best of all, light is cleanest: no fans, interactions, impurities, sounds, gases... Could I build a normal furnace and focus the light from its red-hot charcoal?

Yeah, I wanted to say kW only. Obviously if I said kW you would imagine it as kW for one hour, as it's easy to convert; I just forgot. Sorry, when I edited this line of text I had to re-edit it twice, because I accidentally wrote kWh instead of kW. Probably because I got used to calculating electricity prices.

"The rays are focused onto an area the size of a cooking pot and can reach 4,000 °C (7,230 °F)"

https://en.wikipedia.org/wiki/Solar_furnace

Yes, I noticed that. But that is probably because an incandescent light source gives off only a small percentage of its energy as light and the rest as heat. If the sun is stronger, then why can I look at it with my eyes but can't look at red-hot charcoal or steel? Why do people have to wear face shields when working with torches? How can lasers damage or kill somebody? I have the feeling that it's the opposite.

[Edited on 8-4-2018 by RawWork]

NEMO-Chemistry - 8-4-2018 at 06:42

It takes tiny parabolic mirrors that are constantly repositioned by fractions of a mm. There was (and maybe still is) a small pilot plant in a desert somewhere that used molten salt as the heat sink. A Fresnel lens will reach very high temperatures as well, but go carefully: these things are not toys and serious burns are a real possibility.

Sulaiman - 8-4-2018 at 07:44

Have you considered an electric arc for high temperatures ?

P.S. Heat radiation varies as (area) × T^4  https://en.wikipedia.org/wiki/Thermal_radiation
so enormous power is often required to heat even a small surface area;
consider for example the tungsten filament area in a filament lamp.
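
A rough back-of-envelope sketch in Python of that T^4 scaling (assuming a filament temperature of about 2800 K and emissivity 1; real filaments run cooler near the ends, so treat these as order-of-magnitude numbers only):

    SIGMA = 5.670e-8                     # Stefan-Boltzmann constant, W/(m^2 K^4)
    T_filament = 2800.0                  # assumed filament temperature, K

    flux = SIGMA * T_filament ** 4       # radiated power per unit area, W/m^2
    print(flux / 1e6)                    # ~3.5 MW/m^2

    area_for_100W = 100.0 / flux         # filament area needed to radiate 100 W
    print(area_for_100W * 1e6)           # ~29 mm^2

    print(flux / 1000.0)                 # ~3500x the ~1 kW/m^2 of ground-level sunlight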

[Edited on 8-4-2018 by Sulaiman]

RawWork - 8-4-2018 at 08:31

No, man. An electric arc furnace is no different, actually worse, than a filament or anything else. Worst of all it's dirty, dynamic, wasteful...
But I am not here to discuss furnaces, just the possibility of reaching a higher temperature using artificial light instead of natural light.

This could be asked in a different way, which I think would receive the same answer: can a solar cell give more energy if placed near a fire or some light source instead of in sunlight?

aga - 8-4-2018 at 08:42

Some torches are quite bright these days :
https://www.youtube.com/watch?v=pMSyGOoesfM

Plunkett - 8-4-2018 at 09:18

There are xenon arc lamps that have similar output to that of the sun at short distances but they cost in the thousands of dollars, require a special power supply and cooling system, are pressurized to several atmospheres so you have to wear blast gear when handling them, and only last around a thousand hours.

[Edited on 8-4-2018 by Plunkett]

Morgan - 8-4-2018 at 09:22

Quote: Originally posted by aga  
Some torches are quite bright these days :
https://www.youtube.com/watch?v=pMSyGOoesfM


Is the laser using air or oxygen in conjunction with the photons? Quite a light show.

Deathunter88 - 8-4-2018 at 09:22

Quote: Originally posted by RawWork  

"The rays are focused onto an area the size of a cooking pot and can reach 4,000 °C (7,230 °F)"

https://en.wikipedia.org/wiki/Solar_furnace

Yes, I noticed that. But that is probably because an incandescent light source gives off only a small percentage of its energy as light and the rest as heat. If the sun is stronger, then why can I look at it with my eyes but can't look at red-hot charcoal or steel? Why do people have to wear face shields when working with torches? How can lasers damage or kill somebody? I have the feeling that it's the opposite.

[Edited on 8-4-2018 by RawWork]


I think you would benefit from doing a lot more background reading before jumping to conclusions. People have to wear face shields when working with torches because of the ultraviolet light that is produced. Lasers are powerful due to the coherent light that is concentrated on a small point. And you shouldn't look directly at the sun either, or you may suffer eye damage.

RawWork - 8-4-2018 at 12:02

Why do I have the feeling that what you are all trying to say is that the sun is stronger?
Isn't the sun the weakest?
Can't infrared light and heat also damage eyes? Don't we have to wear face shields because of that too, and not only because of UV?
If the sun is the strongest, then why don't I feel any damage (heat or UV)?
And don't worry about me looking directly at the sun; I looked into it directly for hours after reading that it's beneficial for health in the form of "sun gazing" and "sun tanning". Of course the authors recommending that said it's best done in the morning and evening, when the sun is weakest, with no rapid change to be expected and no extreme exposure.
Isn't a laser stronger than the sun? If the sun were stronger it would cut things, correct? Are lasers the strongest? Can I make 4000 C using lasers?

LearnedAmateur - 8-4-2018 at 12:43

How powerful is the laser? If you’re talking about a 1 mW pointer or even a 100 mW burning laser then of course they’re going to do some damage when focused (especially to the eyes), just as the Sun will when focused. When you stand outside, the rays are collimated but there is no natural mechanism for focusing them onto your skin, so it doesn’t immediately burn like a strong laser. Grab a magnifying glass though and it’s a different matter; focused sunlight does about the same damage as the focal point of a 100 mW+ laser (stinging the skin, melting black plastic, scorching leaves). Make that lens larger, like a 1 m Fresnel lens, and you have a large enough collection area focused into a small enough point that you can literally melt metal and stone with it. It all depends on the power you are dealing with over a certain area - lasers are low power over an extremely small area, whereas sunlight is a higher power over a larger area, but comparing the intensity in W/m^2, the laser wins out unless the sun’s rays are concentrated with a lens.
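
A rough illustration in Python (the beam diameters, spot sizes and 85% lens transmission here are assumed values for the sake of comparison, not measurements):

    import math

    def intensity(power_w, spot_diameter_m):
        # average intensity in W/m^2 of a given power spread over a circular spot
        area = math.pi * (spot_diameter_m / 2) ** 2
        return power_w / area

    print(intensity(0.005, 2e-3))           # 5 mW pointer, 2 mm beam:    ~1.6e3 W/m^2
    print(intensity(0.1, 0.5e-3))           # 100 mW burner, 0.5 mm spot: ~5e5 W/m^2
    print(1000.0)                           # unfocused ground-level sunlight: ~1e3 W/m^2
    print(intensity(1000.0 * 0.85, 10e-3))  # 1 m^2 Fresnel lens (85% transmission)
                                            # focused to a 10 mm spot:    ~1.1e7 W/m^2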

aga - 8-4-2018 at 12:51

Quote: Originally posted by RawWork  
... why don't I feel any damage (heat or UV)? ...

You feel no heat from the Sun ?

What Planet are you on ?

sciencemadness.org is (currently) based on Earth, a small planet out on the edge of an insignificant galaxy.

Edit:

A quick chat with Polverone confirms that he had a chat with Elon Musk and we should be based on the Moon by 2020, also a Mars data centre by 2023, IF Musk's car crashes there as intended (the boot is full of servers).

[Edited on 8-4-2018 by aga]

RawWork - 8-4-2018 at 13:18

I feel heat, but not enough to melt common metals. I am still alive, you see. About 30 Celsius is the highest temperature I have ever felt from the sun. I am asking this:
If the sun is at its strongest in my area, let's say it's 40 degrees Celsius, and the strongest laser on the planet is compared with it over the same area or object, which is stronger? Or if both go through the same lens.
Simple question.

Or simply, what will take less space to build if I want to melt metals using light? The sun or some artificial device like a laser? I have already read about solar furnaces, but the one which gives 4000 Celsius looks big. Would a laser setup be smaller?

Morgan - 8-4-2018 at 15:20

I wonder how intense sunlight would be if you took into account the inverse square law and measured it from a short distance out from the surface of the sun?

elementcollector1 - 8-4-2018 at 15:43

As others have said, you're conflating energy per unit area with energy alone. The Sun puts out an absolutely staggering amount of energy compared to anything humans have ever built, but because it's all emitting radially outward and we're quite far away, the amount that reaches you specifically is lower than anything a laser, torch or furnace might output were you to get near one.
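
A rough sketch of that geometric spreading in Python (using the sun's approximate total output of 3.8e26 W; exact figures vary slightly by source, and this also answers Morgan's question above):

    import math

    L_SUN = 3.8e26            # approximate total power output of the sun, W
    R_SUN = 6.96e8            # solar radius, m
    AU = 1.496e11             # Earth-sun distance, m

    def irradiance(distance_m):
        # power per unit area on a sphere of the given radius centred on the sun, W/m^2
        return L_SUN / (4 * math.pi * distance_m ** 2)

    print(irradiance(AU))     # ~1.35e3 W/m^2 just above Earth's atmosphere
    print(irradiance(R_SUN))  # ~6e7 W/m^2 right at the solar surface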

Therefore, the answer to your question largely depends on the melting point, size and mass of whatever you're trying to melt. If it's just a tiny bead of aluminum, no bigger than the size of a pea or so, a Fresnel lens and a good sunny day should be fine. But if you're trying to operate a larger-scale furnace or trying to melt things like stone or ceramic, I would imagine other approaches to be a better use of your time.

happyfooddance - 8-4-2018 at 16:07

On the youtube channel "king of random", there are a few videos where Grant uses a screen from a projection tv (which shouldn't be too hard to find these days). Check it out, he melts glass, concrete, and quite a few metals with it. Hard to beat with a homemade laser, for myself at least.

Twospoons - 8-4-2018 at 23:14

Just for fun I thought I'd work out the power flow from a "cooking pot" at 4000 C, assuming an emissivity of 1. If the "cooking pot" is approximated by a sphere of 20 cm radius then the surface area is about 0.5 m^2.

At 4273 K the heat radiated from the pot works out to 9.45 megawatts (Stefan-Boltzmann equation). So you would need that much power input to account for radiated loss alone, in order to reach 4000 C (thermal equilibrium). In sunlight terms (@ 1000 W/m^2) that would require a mirror of roughly 9,000 m^2, focused to a 20 cm spot - which I'm not sure is even optically possible. Or you'd need a ridiculously high-powered laser. Maybe the National Ignition Facility would let you play with theirs?
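
For anyone who wants to check the numbers, a quick sketch in Python (taking the pot as a sphere of 0.2 m radius to match the 0.5 m^2 figure, and emissivity 1):

    import math

    SIGMA = 5.670e-8                 # Stefan-Boltzmann constant, W/(m^2 K^4)
    T = 4000 + 273                   # 4000 C in kelvin
    area = 4 * math.pi * 0.2 ** 2    # sphere of 0.2 m radius: ~0.50 m^2

    P = SIGMA * area * T ** 4        # power radiated at that temperature
    print(P / 1e6)                   # ~9.5 MW

    mirror_area = P / 1000.0         # collecting area needed at 1000 W/m^2 sunlight
    print(mirror_area)               # ~9500 m^2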

Even reducing the emissivity of the cooking pot to something slightly more sensible, the power involved is still crazy.

Maybe try heating something smaller - much smaller.

[Edited on 9-4-2018 by Twospoons]

Tsjerk - 9-4-2018 at 03:17

Am I correct when I say a perfectly black object, hovering in a vacuum, will reach the temperature of the surface of the sun when exposed to sunlight? In theory.

j_sum1 - 9-4-2018 at 04:09

Quote: Originally posted by Tsjerk  
Am I correct when I say a perfectly black object, hovering in a vacuum, will reach the temperature of the surface of the sun when exposed to sunlight? In theory.

No. It will give off black-body radiation.

LearnedAmateur - 9-4-2018 at 04:18

It’ll give off radiation, but won’t it reach equilibrium? It would reach a certain temperature through absorption, at which the energy it radiates away balances the energy it absorbs, preventing it from continuously gaining energy.

Sulaiman - 9-4-2018 at 04:24

If the object to be heated was inside a large perfectly mirrored sphere,
with a hole just large enough to illuminate the object,
then I think that the object temperature would approach the observable temperature of the Sun.


Tsjerk - 9-4-2018 at 04:58

@Sulaiman; I think that was the theory I was after.

RawWork - 9-4-2018 at 05:03

Come on, people. If that were true, then all the planets would be like suns. Maybe not all, because some block light from reaching others, but at least Mercury would be. Then you have to consider that these hot planets would radiate their energy to colder ones, so they would never reach the sun's temperature. :cool:

LearnedAmateur - 9-4-2018 at 06:11

Except none of the planets are black-bodies, which are objects that absorb radiation perfectly, hence their name. Planets directly reflect a lot of the light received from the Sun, especially ones with a lot of white, like Earth’s clouds and icy parts.

Vomaturge - 9-4-2018 at 13:45

If you had a perfectly black object in a vacuum exposed to the sun, it would get hotter and hotter until it emitted as much energy as it was absorbing. The sun is about 5800 kelvin (~5500 C). At that temperature, each square centimeter of the sun emits about 6.4 kW of light, infrared, and UV. If you exposed a "blackbody" in a vacuum to that much heat and light per square centimeter, it would heat up until it reached the temperature of the sun. To get that much heat and light, it would have to be very near the sun, maybe even in it (yes, I know it would also not be in a vacuum, and would have convection or conduction heating too if it was in the sun).

But if you were further away, like near Earth, the insulated blackbody would only get maybe 0.14 watt per square centimeter of heat from sunlight, because the sun is so far away and the view factor (https://en.m.wikipedia.org/wiki/View_factor) is very low. The object would heat up until it was radiating 0.14 watts per square centimeter, which is the same as 1.4 kW per square meter. To radiate that much heat, it would have to heat up to about 396 kelvin, or 123 C.

But remember, the sun only hits one side of an object at a time. If the "blackbody" absorbed heat on one side but emitted it on both sides (e.g. if it was a good thermal conductor, or changed orientation regularly), it would only have 700 W per square meter to radiate, which would require a temperature of 60 C. If it was a sphere, it would have four times as much area as was presented directly to the sun. To radiate the average of 350 W/m^2 its temperature would be 280 K, or 7 C, with some spots hotter and some cooler. Hey! It's about 5 C outside right now. I suppose the Earth is almost a blackbody, and the part I'm in is at an almost average position at the moment.
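
All of those equilibrium temperatures come from setting absorbed power equal to radiated power; here is a short sketch in Python (assuming emissivity 1 and the irradiance values quoted above):

    SIGMA = 5.670e-8                        # Stefan-Boltzmann constant, W/(m^2 K^4)

    def equilibrium_temp(absorbed_w_per_m2):
        # temperature (K) at which a blackbody radiates exactly what it absorbs
        return (absorbed_w_per_m2 / SIGMA) ** 0.25

    print(equilibrium_temp(6.4e7))   # right next to the sun:       ~5800 K
    print(equilibrium_temp(1400.0))  # flat plate facing the sun:   ~396 K (123 C)
    print(equilibrium_temp(700.0))   # radiating from both sides:   ~333 K (60 C)
    print(equilibrium_temp(350.0))   # averaged over a sphere:      ~280 K (7 C)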

Okay, back to RawWork's original question: which source of light/heat rays would work best for heating stuff? Obviously, you can get a much stronger source of thermal radiation from a heat lamp, burning charcoal, etc. if it's up close than if it's far away. If they are close enough, they might be more intense than the sunlight reaching Earth.

However, if you try to focus a light source using a lens, the size of the focused spot depends on how large and how close the source is. If you have a magnifying glass or magnifying mirror, you can give this a try right now. Try focusing the light from the sun into a bright, hot point. Then try with a frosted or fluorescent light from a distance of 2 or 3 meters. You will get a much bigger (and probably lightbulb-shaped) spot of light. Now, if you bring the magnifying glass very close to the bulb (like 20 cm), you will have a hard time focusing it, and if you do get it to focus, you will get an extremely large spot, but it won't be any brighter. With a filament bulb, you might be able to get a smaller, brighter spot, but it will never be as bright as the spot from focusing the sun. Even if the light source is put so close to the lens that it is stronger than the sun, that will just create a bigger spot; it won't be brighter.

Another thing to keep in mind is that a lower temperature source like burning charcoal will emit more infrared energy than visible light. A glass lens will absorb this energy and get hot, but it won't transmit it. If you use a lens to focus the heat from burning charcoal, you will end up with a rather large spot of warm light that likely won't even be hot enough to boil water. If the source is a barbecue-sized charcoal fire a meter from the lens, the spot might even be cooler than the radiant heat without the lens. You can focus radiant heat from a concentrated but distant source, but you can't focus an up-close diffuse source very well, even if both are just as hot and bright without a lens.
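
A rough sketch in Python of why the sun focuses to a tiny spot while a nearby bulb does not (the 50 mm lens, 200 mm focal length, and 60 mm bulb at 2 m are assumed numbers purely for illustration):

    import math

    f = 0.20                    # assumed lens focal length, m
    lens_d = 0.05               # assumed lens diameter, m
    lens_area = math.pi * (lens_d / 2) ** 2

    # Sun: effectively at infinity, so its image forms at the focal plane.
    sun_angle = 0.0093                      # angular diameter of the sun, rad (~0.53 deg)
    sun_spot = f * sun_angle                # diameter of the sun's image, ~1.9 mm
    sun_conc = lens_area / (math.pi * (sun_spot / 2) ** 2)
    print(sun_spot * 1000, sun_conc)        # ~1.9 mm spot, concentration ~700x

    # Frosted bulb 60 mm across at 2 m, via the thin-lens equation.
    do = 2.0                                # object distance, m
    di = 1 / (1 / f - 1 / do)               # image distance, ~0.22 m
    bulb_spot = 0.060 * di / do             # image size, ~6.7 mm
    print(bulb_spot * 1000)                 # bigger spot, and the lens only catches
                                            # a small fraction of the bulb's output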

Lasers are kind of in a class by themselves. They can be focused into a very fine point. They can also be emitted in a fine beam, which can travel huge distances while staying bright. They would seem like an ideal way to heat things. Unfortunately, they are not very efficient, and the ones which are powerful enough to be useful are big, expensive, and use a lot of power. A normal laser pointer will hardly heat anything, and the "high-powered" lasers that are just a watt or two are no better than a magnifying glass and the sun.

The best way to heat an experiment would be either to use a lens/mirror and the sun, or else have a heat source very close to the target. If you put your reaction container in a bigger insulated furnace (with heating filaments, an electric arc, fire, etc) then the container will get a lot of radiant heat even without any way of focusing it, because it's so close to the source. Also, there will be hot air and hot surfaces to further heat whatever you are trying to heat up by touching it. Finally, some of the heat that doesn't get absorbed by the "target," or that gets radiated back from it, can just be absorbed by the walls of the furnace, and conducted or radiated back.

Edit:
Quote: Originally posted by RawWork
Why do people have to wear face shields when working with torches?

For one thing, there is very little atmosphere to filter UV light between your eyes and the torch (I'm assuming you're talking about welding or cutting metal). For another, you would also need some kind of face mask if you had a job that involved looking straight at the sun. Intense light sources are focused to a tiny point by your eye's lens, and could get hot enough to burn the very sensitive tissues at the back of your eye. If your job involves welding, you have to look closely at what you're doing; to do that safely, a mask is in order. It also protects your face from drips of metal or oxides.

Quote: Originally posted by RawWork
you would imagine it as kW for one hour, as it's easy to convert.

It looks like you've been studying energy and power in the last week or so. Keep up the good work; these principles will be very useful for understanding all kinds of scientific concepts :)

[Edited on 9-4-2018 by Vomaturge]

AJKOER - 10-4-2018 at 12:21

With respect to chemical activity induced by light, as opposed to heating by direct light, there is an interesting construct of so called 'diffused' sunlight (as opposed to direct sunlight) like what can occur on a cloudy day. See, for example, "TiO2-NiO p-n nanocomposite with enhanced sonophotocatalytic activity under diffused sunlight", abstract at https://www.ncbi.nlm.nih.gov/pubmed/26968646 and also "Diffuse Light for Better Plants" at https://www.growertalks.com/Article/?articleid=20729 .

[Edited on 10-4-2018 by AJKOER]