• 0 Posts
  • 10 Comments
Joined 1 year ago
Cake day: June 9th, 2023


  • h34d@feddit.de to Programmer Humor@programming.dev · 5/5 stars
    9 months ago

    Dev Home is a new control center for Windows. It provides a customizable dashboard for monitoring your projects, tools to set up your dev environment by downloading apps, packages, or repositories, connections to your developer accounts and tools (such as GitHub), and the ability to create a Dev Drive for storage, all in one place.

    • Use the centralized dashboard with customizable widgets to monitor workflows, track your dev projects, coding tasks, GitHub issues, pull requests, available SSH connections, and system CPU, GPU, Memory, and Network performance.
    • Use the Machine configuration tool to set up your development environment on a new device or onboard a new dev project.
    • Use Dev Home extensions to set up widgets that display developer-specific information. Create and share your own custom-built extensions.
    • Create a Dev Drive to store your project files and Git repositories.

    https://learn.microsoft.com/en-gb/windows/dev-home/




  • Both the popular article linked in the OP and the actual paper seem to use the terms “liberal/conservative” and “leftist/rightist” interchangeably. Quote from the paper:

    It is necessary to note that, first, similar to previous studies on this topic that consider the left–right dimension equivalent to the liberal–conservative dimension (Fuchs and Klingemann, 1990; Hasson et al., 2018), throughout this paper, the terms leftist and liberal (and similarly, rightist and conservative) were used interchangeably. The liberal–conservative dimension is often used in the United States, whereas the left–right dimension is commonly used in Europe and Israel (Hasson et al., 2018).

    There were “only” 55 participants, but I assume that if some of them identified as socialist, they would already be included under “leftist/liberal” for the purpose of the study.


  • Thanks for giving additional explanation. I was trying to keep my reply relatively short and agree with most of what you said.

    Although the article is behind a paywall (which is somewhat unusual in cosmology, but I digress), you can check other articles by the same author that also use the “varying constants” framework, for example https://arxiv.org/abs/2201.11667. In his framework, the speed of light c, the Planck constant h, the Boltzmann constant k, and the gravitational constant G depend directly on time, or more precisely, on the expansion factor of the universe.

    Thanks for the arxiv link. I was aware that some people did this kind of thing (time-varying fundamental constants), but since the abstract only speaks of “coupling constants”, I was thinking of Λ (and G), not fundamental constants. There are some theories that motivate a varying speed of light (Hořava–Lifshitz gravity comes to mind, for example), but this proposal doesn’t seem to be motivated by any particular theory, as far as I can tell. I also agree with you that it seems quite weird to give c, h, and k a time dependence each, only to then have them all be functions of G.

    Since this is a time-dependent change, there is no real way to significantly test the hypothesis (unlike the energy-dependent changes).

    I’m not sure if I fully agree with this. Shouldn’t varying c, h, and k with time clearly change any observable related to the dispersion of light and gravitational waves, or to black-body radiation (among many other things)? And if we had access to even just one of those from different times during cosmological evolution (where the change should be much larger than over a few decades in the present), we should in principle be able to check quite easily whether the proposed scaling law holds.

    Of course, the author could always make the variation with time small enough to avoid contradicting experiment (which would indeed make it unfalsifiable in practice), but that seems to go against the main idea of using these time-varying fundamental constants to explain some aspects of cosmological evolution. My guess now would be that the paywalled paper modifies the relation between redshift and time to undo the “damage” done by modifying the constants. Nevertheless, it wouldn’t surprise me if this kind of scaling were already ruled out implicitly by some data, as I can’t imagine it not affecting a lot of different observables; but maybe I’m overestimating the experimental cosmological data available at present, or the size of the variation the author proposes.
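    To make the black-body point concrete (a standard textbook relation, not something taken from the paywalled paper): the peak wavelength of a black-body spectrum is set directly by h, c, and k via Wien’s displacement law,

    ```latex
    % Wien's displacement law: peak wavelength of a blackbody at temperature T
    \lambda_{\max}\, T = \frac{h c}{x_\ast k},
    \qquad x_\ast \approx 4.965,
    % where x_\ast solves x = 5\,(1 - e^{-x})
    ```

    so λ_max ∝ hc/k at fixed T, and any uncompensated time dependence of h, c, or k would rescale e.g. the CMB black-body peak across cosmic history.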


  • According to my understanding, yes. For example, it is usually assumed that shortly after inflation matter was in a quark-gluon plasma, which would imply a smaller strong coupling than today, since confinement is associated with a large strong coupling at low energies. There was also the electroweak epoch, during which the electromagnetic and weak interactions were unified and the corresponding gauge bosons were massless. The masses of the W and Z bosons can thus also be regarded as time-varying, as can the electron charge.

    However, it should be noted that these changes are not all that significant on the cosmological scales under investigation here (e.g. the quark epoch ended at about 10⁻⁶ seconds after the big bang, which is far less than the age of the universe, and it is assumed that it still took quite a while before the first stars formed). A time-varying cosmological constant could potentially be much more relevant (and some quantum gravity theories even predict it), and I’ve heard some people suggest it as a potential solution to the H0 tension. However, I unfortunately can’t access the paper to assess what precisely the author did there, and whether it is in any way similar to what I just mentioned.
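    For reference, the standard one-loop running of the strong coupling (textbook QCD, nothing specific to this paper) shows why the coupling is small at the high energy scales of the quark-gluon plasma:

    ```latex
    % One-loop running of the strong coupling, n_f active quark flavours
    \alpha_s(Q^2) = \frac{12\pi}{(33 - 2 n_f)\,\ln\!\left(Q^2 / \Lambda_{\mathrm{QCD}}^2\right)},
    \qquad Q^2 \gg \Lambda_{\mathrm{QCD}}^2
    ```

    α_s shrinks logarithmically as Q² grows (asymptotic freedom) and blows up as Q approaches Λ_QCD, which is where confinement sets in.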


  • He says he has a new way of describing light where it loses energy over time (something weird) and so it explains redshift.

    From what I understand, the main idea behind tired light isn’t particularly weird: scattering could in principle produce a redshift as well. The issue is that the amount of scattering needed to explain cosmological redshift would also produce other effects, which are not observed. This basically ruled out Zwicky’s original tired light theory from the beginning. The author of this paper seems to try to get around that by combining a smaller amount of “tired light” with time-varying couplings. Unfortunately the paper is behind a paywall, so I can’t give any more details.
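    For context, the generic tired-light scaling (Zwicky’s original idea, not necessarily the modified version in this paper) assumes photons lose a fixed fraction of their energy per unit distance travelled:

    ```latex
    % Energy loss at rate H_0/c per unit distance gives an exponential redshift law
    \frac{dE}{dr} = -\frac{H_0}{c}\, E
    \quad\Longrightarrow\quad
    1 + z = \frac{E_{\mathrm{emit}}}{E_{\mathrm{obs}}} = e^{H_0 r / c} \approx 1 + \frac{H_0 r}{c}
    \quad (z \ll 1)
    ```

    This reproduces Hubble’s law at small redshift, but any scattering mechanism realizing it would also blur images of distant sources, which is not observed.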

    He also says universal constants can change (something never observed before that would fundamentally change physics)

    No, he says that coupling constants (not sure if that is what you mean by “universal constants” or not) can change, which is a generic consequence of the renormalization group (RG) and has in fact been observed in nature (the running of the electron charge or of the strong coupling, to name just the most famous examples). From a QFT perspective, the cosmological constant is also a coupling, and several quantum gravity theories do in fact generically predict or suggest a time-varying cosmological constant. So this part by itself isn’t really that out there, nor that original for that matter.

    However, since I can’t access the paper, I can’t judge whether the author’s way of varying Λ is reasonable or just a way to fit the data without any physical motivation, and I don’t really know what the article means by “he proposes a constant that accounts for the evolution of the coupling constants”.
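    As an example of a running coupling that is actually observed (standard one-loop QED, just to illustrate what “coupling constants can change” means here): the effective fine-structure constant grows with the energy scale Q,

    ```latex
    % Leading-log one-loop running of the QED coupling above the electron mass
    \alpha_{\mathrm{eff}}(Q^2) \simeq
    \frac{\alpha}{1 - \dfrac{\alpha}{3\pi}\,\ln\!\left(Q^2 / m_e^2\right)},
    \qquad Q^2 \gg m_e^2
    ```

    rising from α ≈ 1/137 at low energies to roughly 1/127 around the Z mass. Note that this is a dependence on energy scale, not on time, which is part of why the author’s usage is confusing.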

    and he can explain dark matter

    That seems like a more grandiose claim to me, if accurate. Do you have a source for where the author claims that? Although he wouldn’t be the first to do so.

    I’m pretty sure this guy isn’t toppling physics today as the bar is set high for whatever evidence he is sharing.

    I think this can be said for a lot of popular science articles on topics like this. However, in many cases the blame lies more with the pop-sci journalists, who are looking for a cool story and might over-interpret the author’s claims (I guess “physics toppled!!!11” sounds more interesting than “some guy suggests that some data might be fitted in a slightly different way”). Although in this case at least the age-of-the-universe claim does seem to come from the author.

    Edit: Judging by another article of the author someone else linked me to further down, it seems that while the author does speak of coupling constants, he really does refer to time-varying fundamental constants. So I must agree with the previous poster on this, it does seem quite a bit more out there than I had originally assumed.



  • While it is true that “should of” etc. can easily originate from a confusion between “’ve” and unstressed “of”, which sound identical, the statement

    “Should of” is incorrect

    itself is at least a bit misleading and prescriptivist in its generality.

    Interestingly, there seem to be at least some native English speakers who genuinely do say “should of” (with a stressed “of”) sometimes. This paper, for example, argues that people who say “should of” really do use a grammatical construction of the form modal verb + of + past participle. One argument the author mentions is that this would also explain the words “woulda”, “coulda”, and “shoulda”, since “of” -> “a” is quite common in general (e.g. “kind of” -> “kinda”), whereas “’ve” -> “a” basically doesn’t occur elsewhere (e.g. no one says “I’a” or “you’a” instead of “I’ve” or “you’ve”). Another is that the reverse mistake, i.e. using “’ve” in place of “of” (e.g. “kind’ve”), is much rarer, which is a clear difference from e.g. the situation with “they’re”/“their”/“there”, where people frequently use each of these words in place of the others. I recommend this blog article for a much longer discussion.

    Also, whether a genuine mistake (which it almost certainly is in many cases, although probably not all) or a different grammatical construction, YSK that “should of” etc. didn’t just become popular recently, but have been used for centuries. E.g. John Keats wrote in a letter in 1814: “Had I known of your illness I should not of written in such fiery phrase in my first Letter.” Many more examples (some older as well) can be found e.g. here or here.

    TL;DR: While in many cases “should of” etc. can well be a mistake, originating from the fact that it sounds identical to “should’ve” when unstressed, there is some interesting linguistic evidence that at least in some dialects of English native speakers really do say “should of” etc. (i.e. in those cases it is not a mistake, merely non-standard/dialectal).