astro-cosmo-q 15 hours ago | next |

As someone who works decently close to, but not in, this area, I am surprised to see this on the front page of HN. The paper's authors do not use correct statistical practices (e.g. H_0 cannot be fixed "as a nuisance parameter" to remove a degeneracy with another parameter; nuisance parameters must be marginalized over!), and they fail to account for several effects in their model (e.g. the stretch/color factors for each supernova must be varied) which are known to be necessary for robust inference of cosmological parameters from supernova data.
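
To make that concrete, here is a minimal toy sketch (illustrative only, not the paper's actual analysis; the likelihood, numbers, and parameter names are all made up) of why slicing at a fixed H_0 understates the uncertainty on a correlated parameter, compared to marginalizing over H_0:

```python
# Toy sketch (illustrative only): fixing a nuisance parameter vs. marginalizing
# over it, using a made-up Gaussian likelihood in (H0, Om) with a strong
# degeneracy between the two parameters.
import numpy as np

H0 = np.linspace(60.0, 80.0, 401)   # nuisance parameter grid, km/s/Mpc
Om = np.linspace(0.1, 0.5, 401)     # parameter of interest
dH0, dOm = H0[1] - H0[0], Om[1] - Om[0]
H0g, Omg = np.meshgrid(H0, Om, indexing="ij")

mu = np.array([70.0, 0.3])
cov = np.array([[4.0, 0.09],        # off-diagonal term: correlation ~0.9,
                [0.09, 0.0025]])    # i.e. a strong H0-Om degeneracy
icov = np.linalg.inv(cov)
d = np.stack([H0g - mu[0], Omg - mu[1]], axis=-1)
like = np.exp(-0.5 * np.einsum("...i,ij,...j", d, icov, d))

# Wrong: "fix" the nuisance parameter, i.e. take a slice at H0 = 70.
fixed = like[np.argmin(np.abs(H0 - 70.0)), :]
fixed /= fixed.sum() * dOm

# Right: marginalize, i.e. integrate the likelihood over H0.
marg = like.sum(axis=0) * dH0
marg /= marg.sum() * dOm

def sigma(p):
    # 1-sigma width from the second moment of a normalized 1D density.
    m = (p * Om).sum() * dOm
    return np.sqrt((p * (Om - m) ** 2).sum() * dOm)

print(f"sigma(Om), H0 fixed:        {sigma(fixed):.4f}")  # ~0.022 (too tight)
print(f"sigma(Om), H0 marginalized: {sigma(marg):.4f}")   # ~0.050
```

With a strong degeneracy, the fixed-H_0 slice comes out artificially narrow; marginalization propagates the nuisance parameter's uncertainty into the parameter you actually care about.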

This is an honest question since I have seen this phenomenon occur a few times now with cosmology/astrophysics papers on HN: How did the original poster find this? And why has it gotten such interest/points? I sincerely hope it is simply a well-intentioned interest in our universe (which it greatly heartens me to see!) combined with naïveté (not meant pejoratively, just to refer to lacking context) wrt the technical nature of this work, but I am interested to hear your thoughts.

necubi 13 hours ago | root | parent | next |

I saw it floating around Twitter with some expansive commentary: "This could be an incredible revolution in Cosmology. The Dark Energy model of the universe, which won a Nobel Prize in 2011, may be completely wrong.", etc.

For whatever reason, engineers really hate dark energy and will glom onto any fringe theory that appears to disprove it. Not to psychoanalyze too much, but it seems to be a topic where non-experts get to feel like they're smarter than those PhD cosmologists because they watched a Sabine Hossenfelder video.

See literally any thread about dark energy (or dark matter, which elicits similar reactions) on HN.

codethief 5 hours ago | root | parent | prev | next |

I can't speak to the technical nature of the work, as I don't work in the field, but the last-named author, likely the advisor, seems to be a respectable researcher.

> I have served on the committee of the International Society on General Relativity and Gravitation 2017-2022, and am a past President of the Australasian Society for General Relativity and Gravitation and a past President of the New Zealand Institute of Physics. I served 8 years on the editorial board of Classical and Quantum Gravity, 2012-2019.

http://www2.phys.canterbury.ac.nz/~dlw24/

So while your criticism may well be justified, you make it sound like this is a fringe paper, which I don't think is the case. So instead of opening a meta discussion about what physics papers get posted & upvoted on HN and why (which is hardly novel), I'd be much more interested in how big the issues you're mentioning are and whether the results of the paper could be salvageable. I at least do think that Wiltshire's research is interesting and he has made good points[0] about the challenges of coarse-graining spacetime structures and where & why the underlying assumptions of LCDM might fail to hold.

[0]: See e.g. the lecture notes on https://arxiv.org/abs/1311.3787

marcosdumay 13 hours ago | root | parent | prev | next |

That is probably the same phenomenon that makes the sketchiest papers in nutrition science appear in popular news.

Sketchy things have the most interesting results. People who want entertaining news select for interesting results, and the sketchy ones get over-represented.

SpaceManNabs 10 hours ago | root | parent | prev | next |

I have mentioned this in two comments, but HN gets really kooky when it comes to cosmology. A few years ago I saw a barrage of highly upvoted papers on MOND and de Broglie stuff.

It really made me wonder what else gets posted here that is patently absurd but that I don't have the prerequisite knowledge to filter.

andrewflnr 13 hours ago | root | parent | prev | next |

> well-intentioned interest in our universe

I mean, probably. Though HN does have a taste for fringe theories, which might color your interpretation of "well-intentioned". And most of us aren't really qualified to assess the statistical rigor of astrophysics papers, myself certainly included.

mmooss 12 hours ago | root | parent | prev |

> why has it gotten such interest/points?

For context: The top comment on most HN stories, especially research, tries to completely discredit the OP, often by finding flaws, and especially in statistical methods.

Everything has flaws. I think people are interested in what is valuable and possible. Shakespeare's work has many flaws, but that's not what people focus on.

Also, while you aren't responsible for all those other top comments, why should I believe yours? Usually I just ignore these comments (but I appreciate your curiosity).

JoeAltmaier 2 hours ago | root | parent |

Failures in statistical methods are not covered by 'everything has flaws'. Those are fatal, existential flaws. A claim that is not likely true (though erroneously presented as likely true) is likely false.

molticrystal a day ago | prev | next |

This paper argues that the Timescape model [0] provides a better fit than the cold dark matter model when examining Type Ia supernovae. According to the Timescape model, clocks run faster in voids, where the gravitational field is weaker, so significant differences exist between a galaxy floating in a void and one like the Milky Way. The Timescape model suggests that other models, which fail to account for these differences, lead to less accurate calculations and less plausible solutions.

[0] https://en.wikipedia.org/wiki/Inhomogeneous_cosmology?useski...

T-A 18 hours ago | root | parent | next |

> a better fit than the cold dark matter model

than the Lambda Cold Dark Matter model.

Lambda, i.e. the cosmological constant, a.k.a. dark energy, is what they do away with, not dark matter.

SpaceManNabs 16 hours ago | root | parent |

Thanks for saving me time in dismissing this paper lol. Any time somebody wants to get rid of dark energy, I run into some garbage. Reminds me of the MOND nuts.

Just reading the rest of the comment section is enough to help me verify that.

For some reason, Hacker News always gets kooky when it comes to this stuff.

andrewflnr 13 hours ago | root | parent | prev |

I don't know, the evidence for dark energy has always seemed a lot sketchier than the evidence for dark matter. Dark matter has lots of interlocking lines of evidence. Isn't dark energy pretty much entirely based on various cosmic distance measures that all have huge stacks of assumptions embedded?

SpaceManNabs 12 hours ago | root | parent |

I agree. Until I see better evidence for SNe Ia, WMAP, and cluster formation in another theory, I really want all the charlatans to be quiet. We don't know what dark energy is, but we have decent evidence to say it is there, and decent theory as well.

I am not saying this paper is made by a charlatan, btw. This type of work attracts those people, though.

vlovich123 a day ago | root | parent | prev | next |

If clocks run slower in the presence of gravity, wouldn't it stand to reason that they run more quickly in a void where there's less gravity? Or is the model saying that clocks run even faster in a void than Einstein's theory predicts?

vecter 21 hours ago | root | parent |

Clocks run at "normal" speed (i.e. "1x" speed) in the absence of a gravitational field. The stronger the gravity, the slower they run (i.e. less than "1x" speed).
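
Quantitatively, for a clock at rest at radius r outside a mass M, the standard Schwarzschild time-dilation factor relative to a distant observer is

```latex
\frac{d\tau}{dt} = \sqrt{1 - \frac{2GM}{r c^2}}
```

which goes to 1 ("1x" speed) far from the mass and drops below 1 as gravity strengthens.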

vlovich123 21 hours ago | root | parent |

Right, so is the paper saying that Lambda-CDM completely ignored clock differences due to heterogeneity in the mass distribution of the universe, where isolated galaxies would experience less time slowing than galaxies near other galaxies, which would experience more time dilation?

raattgift 24 minutes ago | root | parent |

In the standard cosmology, the Integrated Sachs-Wolfe effect captures the redshift/blueshift of light from distant sources (up to the Cosmic Microwave Background) as it traverses relatively dense regions and relative voids.

https://en.wikipedia.org/wiki/Sachs%E2%80%93Wolfe_effect
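
For reference, the linear late-time ISW contribution is, schematically, a line-of-sight integral of the time derivative of the gravitational potential along the photon path from last scattering (t_*) to today (t_0):

```latex
\frac{\Delta T}{T}(\hat{n}) \simeq \frac{2}{c^2} \int_{t_*}^{t_0} \dot{\Phi}\big(t,\, \hat{n}\,\chi(t)\big)\, dt
```

In a purely matter-dominated FLRW universe the potentials are constant and this vanishes, which is why a nonzero late-time ISW signal is usually read as evidence that the potentials are decaying, e.g. because of Lambda (or, here, because voids are doing something nonstandard).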

Note that in the next paragraph I depart significantly from the vocabulary that the Timescape programme's proponents have been using for the past twenty years.

ISW and comparable spectroscopy are easy enough to think about in terms of an accelerating cosmic expansion, i.e., relative voids becoming spatially bigger with the expansion. It becomes much less intuitive how to fit the data if one instead keeps relative voids at roughly constant volume: that implies a significant false vacuum above the ground state, with the false vacuum in voids slowly decaying to that state. (Outside the supervoids, near matter, this false vacuum decays much more slowly still.) Because the "vacuum" in the voids isn't really vacuum, one is stuck either with a running function on the constant c (it gets faster with time from the formation of the CMB, because the false vacuum evolves towards a real vacuum) or with adapting lightlike geodesics by imposing refraction (since the false vacuum is a medium).

The usual terminology is reasonably captured in the first paragraph here at <https://en.wikipedia.org/wiki/Inhomogeneous_cosmology#Inhomo...> ("Inhomogeneous universe"). The following short section ("Perturbative approach") is what is done in the standard cosmology when one wants to do detailed studies of filamentary distributions and other structures that are lumpy at some (larrrrrge) length scale of interest: the perturbed homogeneous background is practically always the standard FLRW.

The justification for perturbation theory on FLRW is that even though there are dense spots (notably most galaxies' central black holes), principles like the Birkhoff theorem capture the idea that as you get far enough away from a galaxy it behaves more and more like a small shell, and this happens at intragalactic scales for these SMBHs: gravitationally, even to its arms' structure, it makes practically no difference whether Andromeda's central bulge has a lot more stars/gas/dust or whether it has one, two, or six central SMBHs (at enough spatial separation that they're not mutually orbiting in a way that would generate gravitational radiation our observatories are sensitive to).

The same idea applies to galaxies->galaxy clusters->filamentary structures: as you "zoom out" the density variations become less important: filaments are pretty sparse on average.

The Timescape programme wants a sharper difference in matter sparseness between voids and filaments, and proposes that gravitational backreaction by the matter is responsible for generating it: the presence of matter steepens the density of matter over time (without the visible matter clearly becoming denser). I don't personally see how that's much different from a false-vacuum decay in the voids, conceptually.

Finally, I think the most important result of this latest Timescape paper is a reminder to everyone that supernova data are a mess. A good X-mas present would be a couple of readily visible Milky Way supernovae.

brotchie 19 hours ago | root | parent | prev |

Pet theory: our universe is run on some external computational substrate, and a lot of the strangeness we see in quantum physics consists of side effects of how that computation is executed efficiently.

The inability to reconcile quantum field theory and general relativity comes down to gravity being a fundamentally different thing from matter: matter is an information system that's run to execute the laws of physics; gravity is a side effect of the underlying architecture being parallelized across many compute nodes.

The speed of light limitation is a side effect of it taking a finite time for information to propagate in the underlying computational substrate.

The top-level calculation the universe is running is constantly trying to balance computation efficiently among the compute nodes in the substrate: e.g. the universe is trying to maintain a constant complexity density across all compute nodes.

Black holes act as complexity sinks, effectively "garbage collection." The matter that falls below the event horizon is effectively removed from the computational needs of the substrate. The cosmological constant can be explained by more compute power becoming available as more and more matter is consumed by black holes.

This can be introduced into GR by adding a new scalar field whose distribution encodes "complexity density." e.g. some metric of complexity like counting micro-states, etc. This scalar field attempts to remain spatially uniform in order to best "smooth" computation across the computational substrate. If you apply this to a galaxy with a large central supermassive black hole, you end up with almost a point sink of complexity at the center, then a large area of high complexity in the accretion disk, and then a gradient of complexity away towards the edges of the galaxy. That is, the scalar field has strong gradients along the radius of the galaxy, and this gives rise to varying gravitational effects over the radius (very MOND-like).

Some back of the napkin calculations show that adding this complexity density scalar field to GR does replicate observed rotation curves of galaxies. Would love to formalize this and run some numerical simulations.
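
A toy version of that back-of-the-napkin calculation might look like the sketch below. To be clear, everything here is hypothetical: the extra acceleration term sqrt(g0 * a_N) is an ad-hoc, MOND-like stand-in for whatever a "complexity density" gradient would contribute, not something derived from GR or from the theory above.

```python
# Toy sketch: Newtonian rotation curve for an exponential disk vs. the same
# curve with a hypothetical MOND-like extra acceleration standing in for a
# "complexity density" gradient. g0 is borrowed from the MOND acceleration
# scale purely as an illustrative number.
import numpy as np

G = 4.30091e-6  # gravitational constant in kpc (km/s)^2 / Msun

def enclosed_mass(r, M_disk=6e10, r_d=3.0):
    # Mass of an exponential disk enclosed within radius r (kpc), treated
    # spherically for simplicity: M(<r) = M_disk * (1 - e^{-x} (1 + x)).
    x = r / r_d
    return M_disk * (1.0 - np.exp(-x) * (1.0 + x))

r = np.linspace(0.5, 30.0, 60)            # galactocentric radius, kpc
a_N = G * enclosed_mass(r) / r**2         # Newtonian acceleration
g0 = 3.7e3                                # ~1.2e-10 m/s^2, in (km/s)^2/kpc

a_tot = a_N + np.sqrt(g0 * a_N)           # hypothetical extra term
v_newton = np.sqrt(a_N * r)               # falls off at large r
v_toy = np.sqrt(a_tot * r)                # stays roughly flat

for ri, vn, vt in zip(r[::12], v_newton[::12], v_toy[::12]):
    print(f"r = {ri:5.1f} kpc   v_N = {vn:6.1f}   v_toy = {vt:6.1f} km/s")
```

Any acceleration term with that sqrt(g0 * a_N) form flattens the outer curve by construction, so a sketch like this shows plausibility, not evidence; the hard part is deriving the term from the substrate idea rather than inserting it by hand.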

Would hope that fitting the free parameters of GR with this complexity density scalar field would yield some testable predictions that differ from current naive assumptions around dark matter and dark energy.

fsloth 17 hours ago | root | parent | next |

”External computational substrate” is a useful idea if it leads to falsifiable theories. As a ”theory of everything” it sucks, because it's clearly not motivated by any specific maths or observations but by the human need to map nature onto some comprehensible analogue, i.e. taking some simpler subset of nature and pretending the rest of it is like that as well. So far, nature has usually become more incomprehensible the deeper we've looked at it.

Newtonian mechanics and mechanical clocks being the hottest precision technology of the day led scientists at the time to view nature as clockwork. Now that we have computers, we think ”nature is like computers” because it's an appealing analogue.

But it's a false analogue, imo. Just as clocks are a thing enabled by nature (a subset, in every meaning of the word), computers too are a subset of nature. So yes, nature can think (with human brains) and nature can run computations (with CPUs impregnated with programs), but each of those is just a subset of nature.

Now: games of the mind and helpful analogues rock, and asking ”how is nature analogous to a Turing machine” is interesting for sure. But just because a game is fun or an analogue appealing, one should not forget, in the philosophical sense, that one is playing with only a limited subset of the thing.

PaulHoule a day ago | prev | next |

There's been a general problem in astronomy for a long time: it seems like there just hasn't been enough time for objects to develop.

The oldest version of this I know of can be seen in a diagram of ways that large black holes could possibly form in this book:

https://en.wikipedia.org/wiki/Gravitation_(book)

which shows that as early as 1973, people knew they had no idea how supermassive black holes could possibly form. Lately these problems have intensified, because Webb seems to see all sorts of developments happening a lot more quickly than they should have, which leaves one wondering if the first billion years were really the first ten billion years. Could Timescape explain that?

api a day ago | root | parent |

AFAIK one possible explanation for the black hole issues could be primordial black holes, which are also a candidate for at least a component of dark matter.

PaulHoule 20 hours ago | root | parent |

Yep. There is the idea that you could get little primordial black holes (that maybe weigh as much as a mountain and could be evaporating now) and the idea that you could get huge primordial black holes. Also the occasional strange idea that the universe might be cyclic (not too fashionable but can fill the hole left by inflation) and that black holes can survive the crunch.

numpy-thagoras 20 hours ago | root | parent | next |

Black holes can survive a Big Crunch scenario? That can go a long way to explaining many things. Can you please provide a paper with more references to this, and potentially one with an example mechanism?

api 4 hours ago | root | parent | prev |

I love the idea. It’s one of our current physics hypotheses I hope is true, because it means the universe would be full of tiny things the size of a hydrogen atom with the mass of asteroids.

“The devil’s glitter?”

BTW, they would not suck up planets and stuff like in inaccurate sci-fi. One could be going real fast and fire right through the Earth and do little, maybe cause some seismic events, but we would never know unless we knew exactly what to look for. A tiny black hole would have a tiny event horizon.
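
For scale, a quick back-of-the-envelope check of the numbers above (taking an asteroid-like M ~ 10^17 kg as an illustrative value):

```latex
r_s = \frac{2GM}{c^2} \approx 1.5 \times 10^{-10}\,\mathrm{m}
\quad \text{for } M \sim 10^{17}\,\mathrm{kg}
```

That is about the size of a hydrogen atom, which is why a fast-moving one could pass through the Earth while interacting with almost nothing.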

If you dropped one into a planet or star with a low enough velocity that it didn’t shoot out the other side it might do a lot of damage, then come to rest in the middle and slowly grow. I recall reading that one in Earth’s core would take possibly millions of years to do much since the radiation pressure caused by accretion around it would limit the rate of matter falling into it. Earth would eventually become an Earth mass black hole but it would not happen in any human lifetime, possibly not in the lifetime of the human race.

tigerlily 21 hours ago | prev | next |

This is the opening salvo in cosmology's Battle of Trafalgar. Dave Wiltshire has lined up a set piece 20 years in the making that is going to obliterate Lambda-CDM, MOND, and all the rest.

dcsommer 20 hours ago | root | parent |

Sounds fascinating. Anywhere I can read more about the build-up to this moment? Has David Wiltshire written about this?

ajross a day ago | prev | next |

Webb is turning out to be one of the most impactful pieces of scientific apparatus of the last century or so. Not that it took all the relevant data, but that it was the final thing that broke open all the doors being held shut. We're watching a Kuhnian paradigm shift in astronomy unfold in real time.

epicureanideal 21 hours ago | root | parent |

I’ll be happy to see all the dark matter, dark energy stuff explained away.

SpaceManNabs 16 hours ago | root | parent |

We have a century's worth of evidence for dark matter and about 20 years' worth of evidence for dark energy.

Until an alternative theory stands up to scrutiny, maybe we shouldn't dismiss a priori things we don't understand?

api a day ago | prev | next |

Good Wikipedia article on these types of cosmologies, including timescape cosmology:

https://en.wikipedia.org/wiki/Inhomogeneous_cosmology

numpy-thagoras 19 hours ago | root | parent |

Many advancements in science have happened because we stopped for a second and then looked to generalize our assumptions. Consider, e.g.:

Euclidean geometry -> non-Euclidean geometry
Classical analysis -> nonstandard analysis
Linearity -> non-linearity
Homogeneity -> inhomogeneity
Flat spacetime -> curved spacetime
Singular probabilities -> superposition

All of these loosened certain criteria and opened up many possibilities. It is certainly erroneous to assume we must, by necessity, have a homogeneous cosmology.

scrubs a day ago | prev | next |

I'm surprised cosmology hasn't accounted for differences in clocks, given how central GR is to astronomy. Granted, I am no expert, but adding this dynamic was apparently, until today, a bridge too far, or thought to somehow average out and not be pertinent.

codethief 20 hours ago | root | parent | next |

> cosmology hasn't accounted for differences in clocks given how central GR is to astronomy

Of course it has. Yes, LCDM's FLRW metric, by its defining assumption of spatial homogeneity, doesn't allow the metric (let alone the speed of clocks) to vary spatially. However, it is very common to do perturbation theory on top of the FLRW metric to account for density fluctuations. Besides, there are also models like LTB (Lemaître-Tolman-Bondi) which give up on homogeneity at the non-perturbative level (while still preserving isotropy, though).
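
For instance, in the conformal Newtonian gauge (anisotropic stress neglected), the perturbed FLRW line element reads

```latex
ds^2 = -\left(1 + \frac{2\Phi}{c^2}\right) c^2\, dt^2
       + a^2(t)\left(1 - \frac{2\Phi}{c^2}\right) \delta_{ij}\, dx^i\, dx^j
```

so a clock deeper in a potential well (Phi < 0) already ticks slower than one in a void within standard perturbed LCDM; the timescape claim is that such differences accumulate beyond what perturbation theory captures.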

All in all, the idea that local voids could explain away the Lambda in LCDM is anything but new. It's just that the OP's timescape approach is the first one that seems to produce promising results. (Disclaimer: I merely skimmed the paper.)

throwaway290 14 hours ago | prev |

Why is it a "change"? We already have two cosmology models. This just gives one of them more support, right?