swansont · Moderator · 52,929 posts · 264 days won

Everything posted by swansont

  1. As I said before, you approximate the sphere with something like a geodesic design. The calculations tell you how small the elements would have to be, and you get 32 surfaces (12 pentagons and 20 hexagons).
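As a check on that face count, here is a quick sketch using Euler's polyhedron formula; the 32-surface shape described is a truncated icosahedron (the soccer-ball pattern):

```python
# Face/edge/vertex count for a truncated icosahedron, the usual
# 32-surface approximation of a sphere.
pentagons, hexagons = 12, 20
faces = pentagons + hexagons                    # 32 surfaces

# Each pentagon has 5 edges, each hexagon 6; every edge is shared by 2 faces.
edges = (pentagons * 5 + hexagons * 6) // 2     # 90

# Every vertex joins exactly 3 faces, so count face corners and divide by 3.
vertices = (pentagons * 5 + hexagons * 6) // 3  # 60

# Euler's formula for a convex polyhedron: V - E + F = 2
assert vertices - edges + faces == 2
print(faces, edges, vertices)   # 32 90 60
```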
  2. Yeah. I've read something similar but can't find it. I'm not a GR person, but I see two immediate problems with the post, besides some awkward wording: a ball that has some radius is inconsistent with "an infinitesimally small (in spacetime) freely falling frame." You can only have infinitesimal objects in such a frame. I also don't see how you get two objects at rest with respect to each other in one frame moving in opposite directions when you move to another frame. If the issue is that the ball can't possibly be moving anywhere but toward the black hole in any frame, one has to wonder how the particle came to be at the event horizon and with a radial outward velocity sufficient to escape, but not having been at r-dr the instant before, which would be inside the event horizon, or not undergoing an acceleration to change its direction, meaning it's not in an inertial frame. Invalid premises necessarily lead to invalid conclusions.
  3. But there are 8 surfaces at 4 square feet each, not one at 32 square feet. What one should probably do is take this calculation and invert it, so to speak: using the strength of the materials, find the size of surface that will hold up. A similar volume in a spherical shape will have a radius of about a meter and a surface area of around 12.5 m^2, vs. 14.6 m^2 for the rectangular prism.
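To illustrate the comparison, here is a sketch pitting a box against the equal-volume sphere. The box dimensions are hypothetical, chosen only to land near the ~1 m radius mentioned above; any rectangular prism will lose, since a sphere minimizes surface area for a given volume.

```python
import math

# Hypothetical box dimensions (not from the post), in metres.
a, b, c = 1.2, 1.5, 2.3
box_volume = a * b * c                # ~4.1 m^3
box_area = 2 * (a*b + a*c + b*c)      # ~16.0 m^2

# Sphere of equal volume: V = (4/3) pi r^3  ->  r = (3V / 4 pi)^(1/3)
r = (3 * box_volume / (4 * math.pi)) ** (1 / 3)   # ~1.0 m
sphere_area = 4 * math.pi * r**2                  # ~12.5 m^2

print(f"r = {r:.2f} m, sphere {sphere_area:.1f} m^2 vs box {box_area:.1f} m^2")
```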
  4. If it isn't already gravitationally bound, how would it become so?
  5. Zeno didn't have the concept of infinitesimals available to him, and didn't know that an infinite series can converge.
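The convergence Zeno lacked is easy to see numerically; a minimal sketch of the halving series:

```python
# Partial sums of Zeno's halving series 1/2 + 1/4 + 1/8 + ...
# The infinite series converges to 1, so "infinitely many steps"
# add up to a finite total.
total = 0.0
for n in range(1, 51):
    total += 0.5 ** n
print(total)   # approaches 1 (differs from 1 by 2**-50)
```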
  6. The NASA data I posted earlier shows between 0.15 °C and 0.2 °C for the last decade graphed, exactly what you stated it should be for CO2. It shows about that range for the previous two decades as well.
  7. I went through the math last time you posted that number and got ~4400 dps for K-40 (vs ~3000 for C-14)
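A rough reconstruction of that arithmetic, from A = λN with λ = ln 2 / t½. The body inventories here are commonly quoted textbook values (roughly 140 g of potassium and 16 kg of carbon), and the C-14 result is sensitive to the assumed carbon mass and isotope ratio, so treat the outputs as ballpark figures:

```python
import math

AVOGADRO = 6.022e23
YEAR = 3.156e7  # seconds

def activity_bq(n_atoms, half_life_s):
    """Activity A = lambda * N, with lambda = ln 2 / t_half."""
    return math.log(2) / half_life_s * n_atoms

# K-40: assume ~140 g of potassium in the body, isotopic abundance 0.0117%.
n_k40 = 140 / 40.0 * 1.17e-4 * AVOGADRO
a_k40 = activity_bq(n_k40, 1.25e9 * YEAR)    # ~4300 decays/s

# C-14: assume ~16 kg of carbon, C-14/C ratio ~1.2e-12.
n_c14 = 16000 / 12.0 * 1.2e-12 * AVOGADRO
a_c14 = activity_bq(n_c14, 5730 * YEAR)      # ~3700 decays/s

print(round(a_k40), round(a_c14))
```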
  8. That would be because K-40 has about 50% more activity in your body, and a decay energy around 10x higher than C-14.
  9. Choosing 1998 as one endpoint is cherry-picking, as it was an anomalously high value. Choose 1997 or 1999 and re-do the analysis, and see what you get. One year shouldn't make much of a difference, should it? In fact, one really should use 1996-1997, as that's the low spot of the previous solar cycle. You get a ~0.2 °C increase if you do that. And note that this is the variability over a single cycle, which is a different argument than the longer, multi-cycle trends, which were the topic of previous discussions.
  10. Most fusion initiatives have promised results in "about a decade." How is this one any different, other than being led by the author of "The Big Bang Never Happened"? A lot of alarm bells go off when you dig into the details of this. The web site expends effort differentiating it from cold fusion and ZPE. The billion-degree effort is called "Lerner Theory," an apparently self-named process. The papers listed are on arXiv, but citations in the peer-reviewed literature are absent so far as I can see. Then there's the prediction of $0.002/kWh before a working device has even been built; remember, fission was going to be "too cheap to meter." Sorry, but absent any substantive science and engineering to support this, it registers too high on the BS-meter for me to take it seriously at this time. Feel free to buttress your claims.
  11. One thing is that the scientists, in general, aren't the ones simply extrapolating based on the data points on the graph. That's either an error or manipulation done by people with an agenda — choosing a smaller set of data and using it to support your point (which is a political tactic rather than a scientific one). Two lessons from this are: don't rely on journalists for accurate scientific reporting, and policy-makers need access to impartial scientific analysis, like that provided by the Office of Technology Assessment, which was disbanded in 1995. Which prompts the question: is the IPCC an impartial body? And when one answers, one should justify that answer with some substantive support.
  12. If you mean a tiny change in the data is giving a completely different political result, I agree. It gives those with an agenda an opportunity to trumpet out exaggerations and falsehoods based on misrepresentation of the data and events. You've given several examples in this very thread (mistaking data from the USA as being representative of the world, exaggerating the magnitude of an error, questioning the validity of the science based on that error, etc.). People who trumpeted 1998 as being anything more than an anomalously high data point in an increasing trend are guilty of similar political manipulation. One expects data to be scattered in scientific analysis, and also expects the occasional data point to deviate by a large amount (i.e. several standard deviations). However, if you check the scientific sources, I'll bet you see trendlines that average the data over some period, to smooth out the fluctuations. That's one way to tell the difference between scientific presentation of the data and someone attempting to manipulate it for political reasons. If the data are cherry-picked, it's often an indication of the latter. An extrapolation using only ~1992 through 1998 to predict massive temperature increases would be guilty of the same error. But I haven't run across any; I'd be interested to see them. OTOH, I've seen several extrapolations using 1998 as an endpoint, purportedly showing a subsequent downtrend.
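The smoothing described above can be sketched with a simple centered moving average. The data below are made up solely to show how a single 1998-style spike gets damped in a trendline:

```python
# Centered moving average in pure Python: the sort of smoothing that
# separates a trendline from endpoint cherry-picking.
def moving_average(values, window):
    half = window // 2
    out = []
    for i in range(half, len(values) - half):
        out.append(sum(values[i - half : i + half + 1]) / window)
    return out

# Made-up anomaly series with one large spike in the middle.
temps = [0.1, 0.2, 0.15, 0.6, 0.25, 0.3, 0.35]
smoothed = moving_average(temps, 3)
print(smoothed)   # the 0.6 spike is spread out and damped
```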
  13. I recall reading that certain recently-living organisms that were collected near a large fossil-fuel-burning source were dated (IIRC it was grass near a major highway). It gave the wrong values, because much of the carbon was very old and it skewed the results. So, in principle, the answer is yes. If we double the amount of CO2 in the atmosphere from fossil fuel burning, we can expect the concentration of C-14 to drop by half, on average, and that would hold true for those organisms in equilibrium with the atmosphere.
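A minimal sketch of that dilution argument; the modern C-14/C ratio used here is a rough standard value:

```python
# Fossil carbon contains no C-14 (it has all decayed away), so adding it
# dilutes the atmospheric C-14/C ratio. If fossil burning doubles the CO2,
# half the carbon is "dead" and the ratio drops to half its original value.
original_ratio = 1.2e-12     # rough modern C-14/C ratio
fossil_fraction = 0.5        # fossil carbon's share after a doubling
diluted = original_ratio * (1 - fossil_fraction)
print(diluted / original_ratio)   # 0.5
```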
  14. That's not what I said. The unit systems are manmade. I don't understand the meaning of "without the need to be a certain length." Is the distance between them a physical object? Can I measure the distance a satellite travels? Is that distance a physical thing? (My answers to these are no, yes, no.)
  15. That's the basic conceptual obstacle with which you keep colliding. There is no inherent length of an object — it is a function of your reference frame. What you are describing is called the "proper length" of the object — the length measured in its rest frame. The thing is, if you want to do any useful physics, you have to use the contracted length and dilated time for a moving object. If you try to use the proper length and proper time, you get the wrong answer. If we assume that the laws of physics are the same in all frames, the contraction and dilation have to be real.
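A minimal numerical sketch of using the contracted length, for an arbitrary example speed and proper length:

```python
import math

def gamma(v_over_c):
    """Lorentz factor for speed v expressed as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_over_c**2)

proper_length = 10.0   # metres, measured in the object's rest frame
v = 0.8                # v/c, an arbitrary example speed

# The length you must use in a frame where the object moves at v:
contracted = proper_length / gamma(v)
print(contracted)      # ~6.0 m, not 10 m
```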
  16. Given that bascule has stated an opinion to the contrary, this is clearly not an absolute truth. How about a completely related citation for this, as it is a purely scientific claim? NASA data seems to disagree
  17. I think you've misread the claims. The mini black holes that are moving quickly (assuming they are created) are from cosmic rays striking the fixed earth. At CERN the rest frame is the center-of-momentum frame, so products are indeed produced with zero net momentum. (mod note: post moved from separate thread with similar topic)
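The kinematic difference can be sketched with the standard fixed-target vs. collider formulas for available energy; the beam energy below is an LHC-scale assumption used only for illustration:

```python
import math

# Fixed target (cosmic ray on Earth): E_cm ≈ sqrt(2 * E_beam * m c^2)
# for E_beam >> m c^2, and the products carry large net momentum in
# Earth's frame. Equal opposing beams (CERN): E_cm = 2 * E_beam, and
# the products have zero net momentum in the lab.
m_proton = 0.938     # GeV, proton rest energy
e_beam = 7000.0      # GeV, an assumed LHC-scale beam energy

e_cm_fixed_target = math.sqrt(2 * e_beam * m_proton)   # ~115 GeV
e_cm_collider = 2 * e_beam                              # 14000 GeV
print(round(e_cm_fixed_target), e_cm_collider)
```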
  18. What are the answers you've come up with?
  19. The advantage of the quasi-sphere is that internal support is all compression if it goes along the diameter, which is going to be a lot more robust. With a cube, there will be lateral forces as well, which may be why a joint failed.
  20. Meltdown-proof and fireproof are two separate issues. Meltdown-proof implies that the decay heat (energy from the decay of the fission products) is insufficient to melt the fuel elements. Fireproof means just that, and is protection against particulates from actual combustion. The ceramic coating is presumably not combustible, so as long as its integrity is not compromised you won't get a release of contamination. The answer to that is yes, but in a trivial fashion: He-4 can very occasionally absorb a neutron to become He-5, which decays ... by neutron emission, and quite rapidly. It's just that in general it's probably better not to mix nuclear- and chemical-reaction terminology.
  21. Where did the structure fail? My gut tells me you'd do better with triangles in a polyhedron shape, like a d20, or hexagons like a soccer ball.
  22. That's mainly tritium, though, right? Your link does not explain this at all; it just states that "Dramatically lower high-level radioactive waste per unit of energy – today’s reactors produce 50% more high-level waste than will the GT-MHR," so it's talking about high-level waste, not tritiated water, which you agree is low-level waste. So I ask again, how is less high-level waste being achieved? Is this simply an artifact of having a higher thermal efficiency, so there is less waste per MWh of operation? The numbers seem to indicate this (the thermal-efficiency and waste improvements are basically identical). (And typically the phrase "inert gas" refers to helium's chemical properties. Inert gases in general can most certainly become radioactive; chemical inertness does not imply nuclear stability.)
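If high-level waste tracks thermal energy, the arithmetic works out as follows. The two efficiencies are illustrative round numbers (roughly a light-water reactor vs. a gas-turbine design), not figures taken from the linked page:

```python
# Assume waste is proportional to thermal energy produced. Then waste per
# unit of *electricity* scales as 1/efficiency.
eff_lwr = 0.32      # assumed light-water-reactor thermal efficiency
eff_gtmhr = 0.48    # assumed GT-MHR thermal efficiency

waste_ratio = (1 / eff_lwr) / (1 / eff_gtmhr)   # = eff_gtmhr / eff_lwr
print(waste_ratio)   # ~1.5, i.e. "50% more waste" from efficiency alone
```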
  23. Fish are probably not going to fly, as it were, if you come close to draining the lake. They also don't react well with turbines. Energy differential can be quite high in the right locations. Pump at night, drain during the day. As to the OP, I'm not sure that 10m is sufficient; there will be pressure requirements to turn the turbine, and there will be flow energy at the end, so you won't extract all of the potential energy from the water. As Mr Skeptic points out here, a larger h gives more energy, so if the losses are a fixed value, you become more efficient as h goes up.
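A sketch of the fixed-loss argument with made-up numbers (the reservoir size and loss figure are assumptions; only the 10 m head comes from the thread):

```python
# Energy stored in a pumped-hydro head: E = m * g * h.
# With a fixed loss, the recoverable fraction grows with h.
g = 9.81            # m/s^2
mass = 1.0e6        # kg, an assumed 1000 m^3 of water
fixed_loss = 5.0e6  # J, assumed pressure/flow losses

fractions = {}
for h in (10.0, 50.0):
    e_potential = mass * g * h
    fractions[h] = (e_potential - fixed_loss) / e_potential
    print(f"h = {h} m: recover {fractions[h]:.1%}")
```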