duncesplayed

joined 1 year ago
[–] duncesplayed@lemmy.one 1 points 1 year ago

Awful headline.

Somewhat surprising results, though. They took a fraction of pig blood plasma and injected it into rats over the course of 8 days. Some organs in the older rats showed a lower epigenetic age, and the older rats also performed faster on cognitive tests. The results are more extreme than they predicted they would be (especially for the liver and heart), so we'll see what happens when someone tries to replicate them.

Any speculation about applicability to humans is just science fiction, of course.

[–] duncesplayed@lemmy.one 0 points 1 year ago* (last edited 1 year ago)

Omegle is a bit of a unique case due to their persistent non-action. Most places, if people start grooming children or broadcasting child porn, they'll start banning offenders at the very least. Omegle, nah.

At one point, they put up a warning splash screen ("Careful: there are pedophiles that use this" or something like that), but they took the warning down after a while. And eventually they did officially say that you can't use the site if you're a minor, but of course that was only enforced through the honour system.

Those are literally the only two actions they ever took to address criminal content and behaviour.

[–] duncesplayed@lemmy.one 2 points 1 year ago

Yup, mine, too. I don't remember which version it was, but I'm pretty sure it was still "Turbo" (not "Borland") Pascal, in the late 1990s. Grade 10 computer science was taught on Macintosh QuickBasic and then grades 11 and 12 were "real" programming in Turbo Pascal.

[–] duncesplayed@lemmy.one 5 points 1 year ago (1 children)

Yup, total bullshit. When I got to:

Kaufman hopes it will “transform how the medical community screens for diabetes”.

I started to lose faith that there was anything of interest there. For those who don't know, "how the medical community screens for diabetes" currently is to...draw blood. Like, that's literally it. You fast overnight, go to the doctor's office, get blood taken, and the next day you learn if you're diabetic. If your doctor is really fancy, they may take blood once, then ask you to drink some ungodly, sickeningly sweet glucose potion and take blood a second time to see how your body responds. But that's about the extent of it.

The authors make it sound like you currently have to hike through the Himalayas to get a diagnosis. No, you just take blood. It's fast. It's cheap. It's easy. And it's just about 100% accurate.

I can see something like this being useful in some niche situations, where someone's very remote and it's better than nothing, but "transform how the medical community screens for diabetes" overall is pretty laughable.

[–] duncesplayed@lemmy.one 4 points 1 year ago

I, too, am curious if there's an advertising bubble. I hope so.

I've noticed something about my wife, though. She's not a "mindless capitalist zombie with the sole goal of owning more stuff", but she does pay a lot of attention to advertising. We need more diapers? Well, it just so happens there's some new startup app advertising a free first month, so if she signs up for that, we could get free diapers, and we'd only have to keep the membership for another two months, and they have deals on peanut butter, and we'd get access to their free streaming service and they have Disney, so it's probably worth it overall.

And so it goes, with a million of these deals. The thing is, each "deal" is so complicated that it's extremely difficult to know which ones actually save us money. A cynic would say "you're never saving money: everything's rigged", but that's clearly not true. Some of these deals clearly do work out for us (and some of them cause the startup to immediately go bankrupt). But most of them aren't clearly better or worse for us: we'd have to spend several hours going through hypothetical scenarios to do the full cost-benefit analysis, which we don't do.

I do wonder, on balance, how much it's costing us. I also wonder how many of these deals are specifically (personally) targeted at my wife because they know what she needs and what her habits are.

[–] duncesplayed@lemmy.one 4 points 1 year ago* (last edited 1 year ago) (1 children)

They didn't "try": they did change the licence, from BSD+Patents to MIT. Hardly scandalous.

[–] duncesplayed@lemmy.one 5 points 1 year ago (1 children)

Facebook is a top-10 contributor to Linux. They are major developers of Btrfs and BPF and have contributed to a number of other kernel subsystems, too. Jens Axboe alone is a huge force in Linux.

Outside of Linux, they've created some pretty big open source projects, like React and Ent (their Go entity framework).

Honestly, they've open sourced almost everything they've ever done except for Facebook itself, and are one of the largest open source companies in the world.

[–] duncesplayed@lemmy.one 4 points 1 year ago

This is my one gripe with Debian's installer. I don't mind it setting defaults like 27G for / and 10G or whatever for /tmp. But I don't like that you can't stop it from allocating the entire volume. If it left a few hundred GB unallocated, it would be trivial to expand whichever partition you later realize needs more space.

As it is, if you want to give more room to one partition later on, you have to shrink /home first. If /home is ext4, that's inconvenient, since ext4 can only be shrunk offline. If it's XFS, though, it's a nightmare: XFS can't be shrunk at all, so you have to back up, recreate, and restore the filesystem.

[–] duncesplayed@lemmy.one 26 points 1 year ago

And not all GNU is Linux! Beyond the world-famous GNU Hurd, there's also Debian GNU/kFreeBSD, and Nexenta (GNU/Illumos; Illumos is the open-source continuation of the OpenSolaris kernel).

I think the most esoteric of them, though, is GNU Darwin (GNU/XNU). Darwin is the open source parts of OS X, including its kernel, XNU. There used to be an OpenDarwin project to try to turn Darwin into an actual independent operating system, but they failed, and were superseded by PureDarwin, which took a harder line against anything OS X getting into the system. GNU Darwin took it one step further and removed just about all of Darwin (except XNU) and replaced it with GNU instead.

[–] duncesplayed@lemmy.one 2 points 1 year ago* (last edited 1 year ago)

At a minimum they've got to design a wider issue width. Current high-performance superscalar RISC-V chips like the XuanTie C910 (what this laptop's SoC is built around) are only triple-issue (3-wide superscalar), which gives a theoretical maximum of 3 instructions per cycle (IPC) per core. And even by RISC standards, RISC-V has pretty "small" instructions, so 3 IPC on RISC-V gets less done than 3 IPC even on ARM. For example, RISC-V has no dedicated comparison instructions, so comparisons have to be composed from a few more elementary instructions. As you widen the issue, that complicates the pipelining (and detecting pipeline hazards).
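To make the "composed comparisons" point concrete, here's a small sketch of my own (an illustration, not anything from the original discussion): a C function that returns a boolean comparison, with the instruction sequence a typical RISC-V compiler emits shown in a comment.

```c
/* Illustration only: RISC-V has no flags register and no single
 * "compare" instruction, so materializing a comparison result is
 * composed from more elementary operations. A typical RV64
 * compiler lowers this function to roughly:
 *
 *     xor   a0, a0, a1    # a0 = a ^ b, which is zero iff a == b
 *     sltiu a0, a0, 1     # a0 = (a0 < 1), the "seqz" idiom
 *     ret
 */
int equal(int a, int b) {
    return a == b;
}
```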

There's also some speculation that people are going to have to move to macro-op fusion instead of implementing the ISA directly. I don't think anyone's actually done that in production yet (the macro-op fusion paper everyone links to was a single university research project, and I haven't seen the idea shipped for real). If that happens, it's going to complicate the core design quite a lot.
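For a sense of what fusion would target, here's another sketch of my own (assuming the commonly cited fusion pairs, not any shipping design): an indexed array load that compiles to three simple instructions, which a fusing decoder could recognize and issue as one internal macro-op.

```c
/* Illustration only: a classic macro-op fusion candidate. RV64
 * compilers typically emit three instructions for this indexed
 * load:
 *
 *     slli a1, a1, 3      # scale the index by sizeof(long) == 8
 *     add  a1, a0, a1     # compute the element's address
 *     ld   a0, 0(a1)      # load the element
 *
 * A fusing front end could detect this idiom at decode time and
 * crack it into a single internal indexed-load operation, rather
 * than implementing the ISA one instruction at a time.
 */
long nth(const long *array, long i) {
    return array[i];
}
```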

None of these things are insurmountable. They just take people and time.

I suspect manufacturing is probably a big obstacle, too, but I know quite a bit less about that side of things. Then again, a lot of companies are already fabbing RISC-V designs on modern process nodes.

[–] duncesplayed@lemmy.one 1 points 1 year ago (3 children)

It definitely could scale up. The question is who is willing to scale it up? It takes a lot less manpower, a lot less investment, and a lot less time to design a low-power core, which is why those have come to market first. Eventually someone's going to make a beast of a RISC-V core, though.

[–] duncesplayed@lemmy.one 6 points 1 year ago* (last edited 1 year ago) (1 children)

If you want a good CPU design with a 16-bit address space, take a look at the PDP-11.

It was used in home computers, too, just not in the West (e.g., the Soviet Elektronika BK series was PDP-11 compatible).

I agree with you, though. I'm kind of the prime market for this from an educational standpoint. My oldest kid has just learned to read and write (kind of). She's fascinated by computers. She's only played retrogames (happily) thus far, so she wouldn't be put off by the 8-bit era's graphics or sound.

But even so...what would I be hoping to teach her with this? How to work around the quirks of the 6502 that are not applicable to literally anything else? That life is full of unnecessary obstacles and frustration? That she could have learned more interesting programming in an easier way if I'd got her a computer with a flat memory model? I'm kind of meh on it.
