• 0 Posts
  • 87 Comments
Joined 1 year ago
Cake day: August 20th, 2023







  • Oh yes, absolutely, OP’s X chromosome is expressed. I just meant that unlike all the other chromosomes, where in general the gene copies on both chromosomes are expressed, in XX individuals one of the X chromosomes is usually inactivated, so only one of them is being expressed at a time. The X chromosome has many essential genes, which is also why we have X-linked genetic diseases. XX individuals are often just carriers or more mildly affected, since they have two X chromosomes, while XY individuals are more severely affected, since they have no backup copy of that gene.
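As a hedged illustration of that last point (the labels "XA"/"Xa" are made up here for the working and affected X alleles, not from the comment), a few lines of Python can enumerate an X-linked recessive cross between a carrier mother and an unaffected father:

```python
from itertools import product

# Illustrative sketch: cross a carrier mother (one working X "XA",
# one affected X "Xa") with an unaffected father (XA / Y) for an
# X-linked recessive disease, and count each offspring combination.
mother = ["XA", "Xa"]
father = ["XA", "Y"]

offspring = [tuple(sorted(pair)) for pair in product(mother, father)]
for child in sorted(set(offspring)):
    frac = offspring.count(child) / len(offspring)
    print(child, frac)
```

Each combination is equally likely: daughters are either unaffected or carriers (the working copy masks the recessive allele), while half of sons inherit the affected X with no backup copy at all.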


  • Thank you for clarifying those misconceptions about what recessive and dominant mean. A gene isn’t really dominant or recessive. A phenotype (some trait in the organism, like blue eyes or a certain disease) can be dominant or recessive, though, and results from changes in a gene. The same gene can have many different possible mutations, some with dominant effects, some with recessive effects, and some with no effect at all, depending on the change in the gene and the phenotype.

    To go further on that, many recessive diseases exist because, for many genes, just one functional copy is fine from your body’s perspective. Many recessive diseases are due to loss of function of a gene or its protein product: for a variety of potential reasons the gene no longer yields a functional protein. Often your body can get by with just one working copy making protein, though both gene copies are generally still being transcribed and turned toward functional protein.

    One big exception to this is the X chromosome. Males have only one X, with a Y instead of a second X. The Y is very tiny and has very few genes compared to the X, quite different from the other chromosome pairs, which generally carry copies of the same genes as each other. Early in embryonic development in XX individuals, one of the X chromosomes is generally inactivated and barely expressed; otherwise XX individuals would have double the gene products of all those genes compared to males, which the body does not expect for X genes the way it does for all the other genes that have a second copy.

    https://en.m.wikipedia.org/wiki/X-inactivation
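A toy sketch of that idea, purely illustrative: assume each early cell silences one X at random and all its descendants keep that choice, which produces the roughly 50/50 mosaic of cell lines the article describes.

```python
import random

random.seed(0)

# Each simulated cell of an XX embryo randomly inactivates one X;
# over many cells the two choices come out close to half and half.
cells = [random.choice(["maternal X active", "paternal X active"])
         for _ in range(10_000)]
frac_maternal = cells.count("maternal X active") / len(cells)
print(f"cells expressing the maternal X: {frac_maternal:.1%}")
```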

    If you go even further, you also get into the idea of penetrance. A gene codes for a protein, but that protein doesn’t exist in isolation: it interacts with lots of other proteins coded by other genes in the body, plus the environment. So some genetic changes might have a 100% chance of leading to a certain phenotype (like a disease or a specific trait), while for others it could be less, maybe only a 70% or 30% chance that someone with that change shows the trait, even if it’s still “dominant” (meaning only one gene copy with that change is needed to express the trait).
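A quick, hedged sketch of incomplete penetrance (the 70% figure is just an assumed example, as in the comment): everyone simulated here carries one copy of a dominant variant, yet only a fraction of carriers actually show the trait.

```python
import random

random.seed(1)

# Assumed example: a dominant variant with 70% penetrance, meaning
# only ~70% of carriers develop the phenotype, because other genes
# and the environment also play a role.
PENETRANCE = 0.7
carriers = 100_000
affected = sum(random.random() < PENETRANCE for _ in range(carriers))
print(f"{affected / carriers:.1%} of carriers show the trait")
```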



  • That the eye can only perceive 24 fps is a myth. Visual perception is very complicated, with many different processes, and your eyes and brain don’t strictly perceive things in frames per second. 24 fps is a relatively arbitrary number picked by the early movie industry: it stayed comfortably above 16 fps (below which you lose the illusion of continuous motion) without wasting too much more film, and it’s a nice, easily divisible number.

    The difference between higher frame rates is quite obvious. Grab any older PC game so you can reach a high frame rate, then cap it at 24, and the difference is night and day. The many people who complained about how much they hated the look of The Hobbit, shot at 48 fps, can attest to this as well. You certainly do start to get diminishing returns the higher the fps goes, though.

    Movies can be shot to deliberately avoid quick camera movements and other things that don’t do well at 24 fps, but video games don’t always have that luxury. For an RPG or something, sure, 30 fps is probably fine. But fighting, action, racing, anything with a lot of movement, and especially quick camera movements, starts to feel pretty bad at 30 compared to 60.
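One way to make the diminishing returns concrete is to look at frame times instead of frame rates; this little sketch just does the arithmetic.

```python
# Frame time is the reciprocal of the frame rate; the millisecond gap
# between successive frames shrinks as fps climbs, which is why each
# doubling of fps buys less than the one before it.
frame_times_ms = {fps: 1000 / fps for fps in (16, 24, 30, 48, 60, 120, 240)}
for fps, ms in frame_times_ms.items():
    print(f"{fps:3d} fps -> {ms:6.2f} ms per frame")
```

Going from 30 to 60 fps saves about 16.7 ms per frame, while going from 120 to 240 fps saves only about 4.2 ms.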




  • Ranvier@sopuli.xyz to Comic Strips@lemmy.world · 8 months ago

    Depends on language, culture, and context. In the United States we use “America” to refer to the country and “North America” and “South America” to refer to the continents. Many Latin American countries use a six-continent system, though, where North America and South America together are a single continent called America. This can lead to some tension and confusion when people from the United States call themselves “American,” since to them that would basically imply everyone in the western hemisphere. While “americano” is sometimes used to refer to people from the United States, you also get descriptors like “estadounidense” in Spanish for this reason. Though even that has ambiguity, since technically Mexico is also a “united states.”

    Anyways, the point is that a seven-continent system, with the western hemisphere separated into North and South America, isn’t used everywhere; for some people America is a continent. In some places Europe and Asia are combined, and there are other variations too. None of them line up perfectly with plate tectonics or anything else, so they’re all a little arbitrary in the end.

    https://en.m.wikipedia.org/wiki/Continent


  • This is bizarre. I looked, and Rochester, Minnesota has multiple high-speed providers, including two that offer fiber.

    And the ISP you have is a wireless ISP that doesn’t even list Rochester within its coverage area; it’s intended to serve more rural areas west of the city. On their map the coverage gets close to, but not quite into, Rochester, though maybe they can still reach it (slowly) since it’s a wireless provider.

    I’m guessing this is a problem with whoever owns your Airbnb rather than with Rochester, Minnesota. I don’t understand why they would pay for this rather than use any of the readily available high-speed options there.



  • To add to this, most GPU reviews now include two sets of benchmarks, one with ray tracing and one without. You can see the gap in ray tracing performance at each price point narrowing considerably over the years as AMD catches up. It also narrows further at higher resolutions, since the price-equivalent AMD options tend to have more raw performance and more memory, which becomes increasingly important at higher resolutions. Right now, all else being equal, at most price points AMD leads in non-ray-tracing performance and Nvidia leads in ray tracing performance. Beyond your target resolution, which card wins can also vary a lot per game, so if you have a particular game in mind, I’d look for a benchmark of that game so you know what to expect from different cards, then see what makes the most sense for your targeted performance, budget, and priorities.
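If you want to compare cards for one particular game, dollars per frame is a simple way to line them up; the card names, prices, and fps below are placeholders purely for illustration, not real benchmark results.

```python
# Hypothetical helper for comparing two cards in one game; all numbers
# here are made-up placeholders, not actual benchmarks.
def dollars_per_frame(price_usd: float, fps: float) -> float:
    """Cost per frame of average performance in the chosen game."""
    return price_usd / fps

cards = {
    "card A (raster lead)": (500, 140),  # (price in USD, average fps)
    "card B (RT lead)": (500, 125),
}
for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_frame(price, fps):.2f} per frame")
```

With these placeholder numbers, card A works out to about $3.57 per frame and card B to $4.00; rerun it with real benchmark figures for the game you actually care about.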

    And to clarify for OP, by ray tracing performance I mean the fps with ray tracing turned on. Visually it will look the same in any particular game no matter which GPU you’re using, since it’s the game that implements the ray tracing. The one exception I know of in terms of actual quality right now is “ray reconstruction,” part of DLSS, which only works on Nvidia chips and which they claim cleans up the noise between individual rays better than traditional denoisers through the use of AI. Theoretically there should be other ways to reduce noise at a performance cost too, so in the end it comes down to performance and game-by-game implementation again. Not a lot of games have this right now; I think Cyberpunk, Portal, and Control.

    Especially since I sometimes use VR, I tend to favor raw power at a given price point to get the best resolutions and frame rates. If you just want a great picture at lower resolutions like 1080p, there are diminishing returns (is 180 fps really a better experience than 120 fps?) to favoring non-ray-tracing performance, which might make an Nvidia card with less raw performance more of a consideration if its non-ray-tracing performance is good enough for you. And if money is no object, Nvidia has the best-performing GPU overall in all respects at the extreme high end (the 4090), with no equivalent AMD option at that level.

    DLSS vs FSR also needs to be considered, with FSR not as far along as DLSS. This matters more at the lower end, though (except in the case of ray reconstruction); higher-end GPUs likely won’t need to rely on these technologies to reach good fps in current games. Hopefully FSR continues to improve and becomes a more widespread option. AMD is also working on Fluid Motion Frames at the driver level, which may allow an effect similar to FSR 3 even in games that don’t implement it specifically.