skip0110

joined 1 year ago
[–] skip0110@lemm.ee 5 points 1 year ago

A great way is charging by the volume of trash produced. My city works that way (pay per bag) and we produce very little trash (sometimes not even filling one bag in a week). It also makes you really consider a purchase when you factor in the potential cost of throwing it away, if it isn’t reusable.

[–] skip0110@lemm.ee 0 points 1 year ago

I don’t hate Google. But some of their services/products are buggier than the competitors’ (gmail, chat, chrome), and some don’t have much utility (free-form search for products or recommendations, maps), so I use the better competitor products where that benefits me. And I use the Google product when it offers me a benefit (search for technical documentation or finding a specific URL, chrome devtools). In some cases I’m locked in (gmail), and in that respect it’s frustrating (but not unique to Google).

[–] skip0110@lemm.ee 0 points 1 year ago (1 children)

What’s worse, the parking or the broad generalizations in the comments here?

[–] skip0110@lemm.ee 58 points 1 year ago (3 children)

5.25 billion smartphone users, so they are paying about $5 per user. If you switch the default from Google, you are taking $5 from them!
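
Rough back-of-the-envelope, assuming the figure being reported at the time was on the order of $26 billion per year in default-placement payments (that total is an assumption on my part, not something stated here):

\[
\frac{\$26\ \text{billion per year}}{5.25\ \text{billion users}} \approx \$5\ \text{per user per year}
\]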

[–] skip0110@lemm.ee 1 points 1 year ago

I’ve had the same pair of Rockport boots for 20+ years.

[–] skip0110@lemm.ee 10 points 1 year ago

I understand where you’re coming from. If you’re used to cooking “by the seat of your pants” for one, scaling to a group is more complex than just increasing the amounts.

A couple things that can trip you up:

Prep: Bigger ingredient amounts mean you probably should prep them before starting. E.g. I can peel and dice one potato in the time it takes water to come to a boil; six potatoes, not so much. Do a mise en place.

Seasoning: taste more often and consider aiming for a more “average” palate. E.g. I like my food with very low salt but more pepper, but I don’t do this when cooking for others.

Pans: larger quantities mean you might have to do some steps in batches (browning) or use two pans where a single pan would have worked for one (e.g. for one you can split the pan and brown the meat at the same time as cooking the onions). Set aside pans/trays to hold the parts of the meal that are partially cooked. When making a lot of something you end up repeating the same task several times, so a little prep and organization makes things go smoothly, and making each repeated task a little quicker adds up to a big benefit. You might not want the extra prep dishes to wash when cooking for one, but when cooking for more, the better organization actually makes it go quicker.

You can still cook by taste/eye/instinct for the ingredients and amounts. It’s just that planning and organization become more important.

[–] skip0110@lemm.ee 1 points 1 year ago

It’s not so much the fake reviews as the very honest, bluntly negative ones. Those get in the way of Amazon taking a cut from sellers offloading useless crap onto their customers, and those are the reviews I think they will be purging.

[–] skip0110@lemm.ee 38 points 1 year ago (8 children)

How much disdain I have for change (“they are just making it worse!”) aka grumpy old man syndrome

[–] skip0110@lemm.ee 24 points 1 year ago (1 children)

If you log in to the Gmail app on any device, it can also act as 2FA. It does not need to be the device where they send the push…any logged-in device will work.

[–] skip0110@lemm.ee 14 points 1 year ago

Now I believe it more than ever

[–] skip0110@lemm.ee 35 points 1 year ago (14 children)

A malformed (attacker-crafted) webp file could cause Chrome (or other Chrome-based browsers) to execute arbitrary code when rendering it. The file might be embedded in a web page you view. Other applications that use Skia for graphics are theoretically affected too.
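
To make that last point concrete: the exposure is “decode an untrusted webp file”, not “use a browser”. A rough sketch, with Pillow standing in for any program that decodes images (the file name is made up):

```python
# Toy illustration, not the actual exploit: the point is only that the
# dangerous step is decoding untrusted image data, wherever it happens.
from PIL import Image

def render_untrusted(path: str) -> None:
    # This decode is exactly the code path a crafted file targets in a
    # vulnerable decoder, whether it runs in a browser, a chat app, or a script.
    with Image.open(path) as img:
        img.load()  # force the actual webp decode

render_untrusted("attacker_supplied.webp")  # hypothetical file name
```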

[–] skip0110@lemm.ee 24 points 1 year ago* (last edited 1 year ago) (2 children)

I think this model has billions of weights, so the model itself must be quite large. Since the receiver needs to already have this model, I’d suggest that rather than compressing the data, we have instead pre-encoded it by embedding it in the model weights, and thus the “compression” is basically just passing a primary key that points to the data already stored in the model.

It’s like, if you already have a copy of a book, I can “compress” any text in that book into 2 numbers: a page offset, and a word offset on that page. But that’s cheating because, at some point, we had to transfer the book too!
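
A toy version of that book trick, just to make it concrete (the book text and names here are made up):

```python
# "Shared book" compression: only works because both sides already hold the book.
book_pages = [
    "it was the best of times it was the worst of times",
    "it was the age of wisdom it was the age of foolishness",
]

def compress(word: str) -> tuple[int, int]:
    """'Compress' a word to two numbers: (page offset, word offset)."""
    for page_no, page in enumerate(book_pages):
        words = page.split()
        if word in words:
            return page_no, words.index(word)
    raise ValueError("word is not in the shared book")

def decompress(page_no: int, word_no: int) -> str:
    """Recover the word -- possible only because the receiver has the whole book."""
    return book_pages[page_no].split()[word_no]

ref = compress("wisdom")   # (1, 5): two small integers instead of the word
print(decompress(*ref))    # "wisdom", because both sides shipped the book
```

The two integers look tiny, but the real information cost was paid up front when both sides got the book, same as shipping multi-gigabyte model weights.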
