megaman

joined 1 year ago
[–] megaman@discuss.tchncs.de 3 points 1 day ago* (last edited 1 day ago)

4:45pm Friday news cycle

[–] megaman@discuss.tchncs.de 26 points 3 days ago (3 children)

If you aren't an actual journalist who is being personally, specifically hunted, then you probably don't need to take the same precautions as one.

And yeah, the guide boils down to "none of these things are 100% safe, but they are realistic things you can do that offer more protection than not doing them."

Your skimming of the article missed that they do talk about the shortcomings of every suggestion they make. For example, the article does say that you can turn off GPS but your phone will still ping towers, revealing your location, and it goes on to say that you can put your phone in a Faraday bag, which isn't practical for most people but is an option if you want it.

[–] megaman@discuss.tchncs.de 3 points 4 days ago

I use the parental controls on the router to put the roomba in grounded-child mode.

That said, I'm not actually positive it works... it is able to connect to Home Assistant, so it definitely has local network connectivity, but I haven't proved to myself that it is actually unable to reach its remote servers, since it isn't that big of a deal to me.
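
If I ever do want to prove it, a rough check would be to watch the roomba's traffic from the router itself, assuming the router can run tcpdump (OpenWrt or similar; the IP below is just an example):

```bash
# watch for any packets from the roomba to destinations outside the LAN;
# if the parental-control block actually works, nothing should show up here
tcpdump -ni any 'src 192.168.1.42 and not dst net 192.168.1.0/24'
```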

[–] megaman@discuss.tchncs.de 3 points 1 week ago

The git repo should ignore the venv folder, so after you clone you create a new one and activate it with those steps.

Then, when you are installing requirements with pip, the repo you cloned will likely have a requirements.txt file in it, so you run 'pip install -r requirements.txt'.
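
Put together, the whole dance looks something like this (repo URL and folder name are placeholders):

```bash
git clone https://example.com/some/repo.git   # placeholder URL
cd repo
python3 -m venv venv              # the venv folder is (or should be) gitignored
source venv/bin/activate
pip install -r requirements.txt   # install the project's pinned dependencies
```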

[–] megaman@discuss.tchncs.de 1 points 2 weeks ago

It is indeed infrequent, but the modern world has trained me to expect convenience and instant-ness. Last time I wanted a 12-year-old email I was in the car with friends and wanted to pull it up. It wasn't anything important at all, to be clear, but I'm hoping to search my 12-year-old emails with the same convenience as last month's.

[–] megaman@discuss.tchncs.de 2 points 2 weeks ago

I think that's right: I fundamentally want an archive, not what a normal mail server provides. Part of my thinking in looking at mail servers is that those would integrate directly with whatever other front-end/client I'd normally use, whereas an archive maybe would not.

And regarding archive-specific stuff, I am seeing some things on a search, but I guess I'm wondering if folks here have any recommendations. When I look at , for example, nothing comes up for email archives, just for email servers. That, plus what I see when searching, makes me think the archive-specific stuff is either oriented toward businesses or toward a CLI (like notmuch, which was mentioned in the discussion here and does look cool).
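
For anyone else weighing the CLI route, this is roughly what searching an archive with notmuch looks like, as far as I can tell (addresses and dates are made up):

```bash
notmuch setup    # interactive: point it at the Maildir holding the archived mail
notmuch new      # build/update the index
# search terms are ANDed together
notmuch search from:someone@example.com subject:invoice date:2012-01-01..2012-12-31
```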

[–] megaman@discuss.tchncs.de 2 points 2 weeks ago (2 children)

This looks like a good backend for sure, but the web frontends look a little lacking and I'm not seeing anything about a mobile frontend (other than if a web one was up, which would be fine). Have you tried any of the web frontends?

 

Yet another question about self-hosting email, but I haven't found the answer, at least not phrased in a way that matches my question.

I've got ~15 GB of old Gmail data that I've already downloaded, and Google is on my ass about "91% full", and we know I'm not about to pay them for storage (I'll sooner spend 100 hours trying to solve it myself before I pay them $3/month).

What I want is to have the same (or close to the same) access and experience finding stuff in those old emails when they are stored on my hardware as I have when they are in my Gmail. That is, I want a website and/or app where I can search for emails from so-and-so, in some date range, with keywords. I don't actually want to send any emails from this server or receive anything to it (maybe I would want Gmail to forward to it or something, but probably I'd just do another archive batch every year).

What I've tried so far, which is sort of working: I've set up docker-mailserver on my box, and it is working and accessible. I can connect to it via Thunderbird or K-9 Mail. I also converted the big email download from Google, which was a .mbox, into Maildir using mb2md (apt install mb2md on Debian was nice). This gave me a directory with ~120k individual email files.
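
For reference, the conversion was basically this (paths are examples, and if I remember the man page right, a non-absolute -d lands relative to your home directory):

```bash
sudo apt install mb2md
# convert the Takeout .mbox into a Maildir tree of one-file-per-message
mb2md -s ~/takeout/all-mail.mbox -d Maildir
```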

When I check this out in Thunderbird, I see all those emails, and they look like they have the right info. (As an aside, I actually only moved 1k emails into the directory that docker-mailserver has access to, just for testing, and Thunderbird accordingly only sees that 1k.) I can do some searching on those.

When I open it in K-9, by default it looks like it just pulls in 100 of them. I can pull in more, or refresh, that sort of thing. I don't normally use K-9, so I may just be missing how the functionality there is supposed to work.

I also just tried connecting to the mail server with Nextcloud Mail, which works in the sense that it connects but it (1) seems like it is struggling, and (2) is putting 'today' as the date for all the emails rather than when they actually came through. I don't really want to use Nextcloud Mail here...

So, I think my question here is now really around search and storage. In Thunderbird, I think the way it works (I don't normally use Thunderbird much either) is that it downloads all the messages locally and then searches them locally. K-9 appears to be the same, with the caveat that it doesn't look like it really wants to download 120k emails locally (even if I can make it).

What I think I want, though, is to have the search run on the server. I don't want to download 15 GB (and another 9 from Gmail soon enough) to each client. I want it all on the server, where I just put in my search, the server does the query, and it gives me a response.
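
My understanding, for what it's worth, is that plain IMAP SEARCH is already executed by the server, not the client, so the question is mostly whether a given client uses it instead of syncing everything. A crude way to sanity-check that the server is doing the work (host, account, and dates below are made up):

```bash
# talk IMAP to the server over TLS and issue a SEARCH directly; whatever comes
# back is computed server-side, with no local mail store involved
openssl s_client -connect mail.example.com:993 -crlf -quiet
# then type, one line at a time:
#   a LOGIN me@example.com mypassword
#   b SELECT INBOX
#   c SEARCH FROM "so-and-so" SINCE 1-Jan-2012 BEFORE 1-Jan-2013 TEXT "keyword"
```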

docker-mailserver has a page for setting up full-text search with Xapian, where it builds all the indices and so on. I tinkered with this and think I got it set up. This is another thing where I would want the search to be using the server rather than the client, since the server is (hopefully) optimized for some of this.
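
In case it helps anyone, the Dovecot side of that ended up looking roughly like this. I'm writing it from memory of the fts-xapian docs, so the exact file name and values may differ in the docker-mailserver setup:

```
# Dovecot override enabling the Xapian full-text search plugin
mail_plugins = $mail_plugins fts fts_xapian

plugin {
  fts = xapian
  fts_xapian = partial=3 full=20
  fts_autoindex = yes
}
```

After that, something like `doveadm index -A '*'` should build the index for the mail that is already there.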

Should I be using a different server for what I want here? I've poked around at different ones and am more than open to changing to something that better fits this use case.

For clients, should I be using Roundcube or something else? Will that actually help with this 'use the server to search' question? For mobile, is there any way to avoid downloading all the emails to the client?

Thanks for the help.

[–] megaman@discuss.tchncs.de 4 points 1 month ago

This article isn't about how emails associated with logins got released in a breach, but about how documents uploaded to the archive are stamped with the email address of the account that uploaded them, and that can be viewed by anyone who downloads the document.

So in standard, everyday use of the site, email addresses are being revealed and associated with the actions of that person. If I upload a copy of the manual for my washing machine, to take a more benign example, my email is linked to that document now.

Then combine this with the facts that (1) the Internet Archive says in multiple spots that they don't reveal this info anywhere, and (2) the issue has been raised with the organization, and it becomes more of a specific negligence on their part.

[–] megaman@discuss.tchncs.de 2 points 1 month ago

The NSA wants to watch the people who are watching the Pornhub video of someone else watching porn. The third level there is more difficult to find.

[–] megaman@discuss.tchncs.de 6 points 1 month ago

Playing games was fine; it was loading things up that sucked. I haven't gotten Dota up on the SSD yet, but on the HDD it was really clunky and would half-load the landing page and then sit there for ~10 seconds.

The biggest difference, though, is that Firefox now opens immediately instead of taking ~10 seconds after clicking the icon.

 

I installed Pop!_OS as my daily driver some months ago (completely got rid of Windows) and have thought it pretty good. But something about it seemed off: programs took just a bit too long to open, it wasn't snappy... Once I got into something it seemed to run fine (playing Dota or anything else was fine after the initial hiccups).

Well, today, I figured it out...

When I did the first install, I was very nervous about deleting all of my existing data on my disks and so tried to manually partition everything so that I could get it right (I think I was also planning to dual-boot).

Fast forward to today, and I'm testing speeds on all the drives to see which one to pitch for a new one I acquired. I see the 3 HDDs, but where is the SSD... Oh god. I installed the boot partition and root and home all onto one of the ~12-year-old HDDs, and the SSD has been sitting idle.
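
For anyone who wants to check their own setup, this is roughly all it takes (device names are examples):

```bash
# which physical disks are root and home actually on? ROTA=1 means spinning rust
lsblk -o NAME,TYPE,SIZE,ROTA,MOUNTPOINT
# rough sequential-read speed per drive
sudo hdparm -t /dev/sda
sudo hdparm -t /dev/nvme0n1
```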

Anyway, I'm just about done with the fresh install onto the SSD; hopefully it isn't too hard to port over the home directory from that HDD...
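
My plan for that, assuming the old HDD mounts cleanly (mount point and username are placeholders), is basically one rsync that preserves permissions and hard links:

```bash
# copy the old home over, keeping ownership, ACLs, xattrs, and hard links
sudo rsync -aHAX --info=progress2 /mnt/old-hdd/home/megaman/ /home/megaman/
```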

[–] megaman@discuss.tchncs.de 29 points 2 months ago (1 children)

It sounds like you have a heavy duty door lock to be very secure, but you are essentially trying to backdoor all that security with a new internet-connected thing. An adversary only has to break the weakest link here, rendering the physical door lock obsolete.

If you are just going to have some digitally-connected device ultimately controlling access to the house, I'd go with a standard smart door lock that does that (I haven't used them, but they exist). The physical lock on those is surely weaker than what you have now, but with your proposed solution the physical lock probably isn't what people would crack anyway.

 

A friend who is not a software person sent me this tweet, which amused me as it did them. They asked if "runk" was real, which I assume it is not.

But what are some good examples of real ones like this? xz became famous for the hack, of course, so I then read a bit about how important this compression tool is/was.

 

An android messaging app that sends everything as an image where the text is in a blue bubble. All images, baby.

 

So, I know very little and have a poor understanding of software licenses, hence the question.

I have a 'smart' thermostat that came with the new HVAC system. It is the AprilAire 8920W. It has a touchscreen, connects to WiFi, and does lots of 'computer' things. I cannot imagine that this furnace company built their own OS and kernel and everything else from scratch; it seems most likely it is running Linux, yeah? And, with that, includes libraries and other tools that are under some version of the GPL, yeah?

I went down the router rabbit hole some weeks ago and found the firmware for routers available on the Linksys website; the Linksys site has this 'GPL Code Center'. I'm finding nothing of the sort from AprilAire, though...

So, if we assume that my 'smart' thermostat is running Linux (and, say, BusyBox, a common GPL-licensed tool on small systems like routers), they are obligated to provide the source code for at least those pieces of software, right? They need to give me a CD or have a page on their website (and include the link in the manual) and all that?

Do they need to give me access to the entire firmware as well? The router folks do, but you also sometimes need to re-install the firmware manually, so that may not be a license issue.

However, how would we know if they are violating a license if we don't know what is running on it?
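
The best I can think of for figuring that out from the outside is to poke at it from the LAN; the IP here is made up:

```bash
# scan the thermostat for open ports and service banners
nmap -sV -p- 192.168.1.50
# if it serves a web UI, the headers sometimes give away the stack
curl -sI http://192.168.1.50/ | grep -i '^server:'
```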

I'm curious about how the GPL / copyleft licenses work, and wondering if I've found someone who is violating them. I also want to hack the thermostat to control it without the motherfuckin' cloud, but that is a bit separate.

14
submitted 10 months ago* (last edited 10 months ago) by megaman@discuss.tchncs.de to c/selfhosted@lemmy.world
 

I've got my main house server that runs a number of dockerized applications, including nextcloud-aio. Nextcloud AIO comes with a built-in backup system using BorgBackup. I've had this running and doing my backups; it is probably fine. Notably, it does encrypt the backup.

Now, I recently set up a separate machine that uses rsnapshot to back up the things from the main machine that need backing up. It SSHes in on a schedule and backs up the folders I've listed.
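
For context, the relevant bits of my rsnapshot.conf look roughly like this (hosts and paths are placeholders, and rsnapshot insists on tabs, not spaces, between fields):

```
snapshot_root	/mnt/backup/snapshots/
cmd_ssh	/usr/bin/ssh
retain	daily	7
retain	weekly	4
# pull the app-data folders over ssh from the main server
backup	root@mainserver:/opt/appdata/	mainserver/
```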

When I set that up, I skipped the Nextcloud borg backup, since that data is already being backed up; however, it is not a remote backup, so it is of limited use (granted, my 'official' backup computer is sitting about 18 inches away from the main server, so it is also of limited use).

I can easily just include the nextcloud-borg-directory on the rsnapshot list, but does anyone know if it will properly handle just the updates?

That is, both Borg and rsnapshot are set up so that each backup isn't a complete copy but just the incremental changes, so that you don't fill your whole disk in two weeks. But if Borg does that first on the Nextcloud data, will rsnapshot just not cope and try to back up the full 50 GB every day? Or will it only pick up the incremental changes? Will the Borg encryption jack up rsnapshot's ability to see the changes?

If no one knows, I will just do it anyway and report back in a few days if my disk is completely full or not.

Edit: it has been ~4 days, and I think it is not all busted (not going to say it is a good idea). The total space it is taking up on the second (backup) machine is what I expect; it hasn't ballooned because rsnapshot can't properly grok the Borg backup format or anything like that. Importantly, this is after ~4 days with very few changes (updates/deletions/edits) to anything on the Nextcloud.
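
If anyone wants to check the same thing on their own setup, plain du is enough, since it counts hard-linked files only once per invocation (paths are examples):

```bash
# daily.1 and older should report far less than daily.0 if the unchanged
# borg segment files are being hard-linked rather than re-copied
du -sh /mnt/backup/snapshots/daily.0/ /mnt/backup/snapshots/daily.1/
# for contrast, the apparent (logical) size of a single snapshot
du -sh --apparent-size /mnt/backup/snapshots/daily.1/
```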

 

Hey, all.

Is it possible to skip the 'register your server' step when creating a self-hosted Rocket.Chat instance? I just don't want to, ya know? Regular web searching mostly turns up how to disable user registration rather than how to skip registering the server with Rocket.Chat HQ.
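
The one lead I've found, which I have not verified, is that Rocket.Chat supposedly reads OVERWRITE_SETTING_<setting id> environment variables at startup, and marking the setup wizard as completed is said to skip the wizard (registration screen included) on a fresh instance. Something like this alongside the usual MONGO_URL and friends:

```bash
# untested: override the setup-wizard state via environment variable so the
# wizard (and its registration step) never shows on first login
docker run -d --name rocketchat \
  -e OVERWRITE_SETTING_Show_Setup_Wizard=completed \
  rocketchat/rocket.chat:latest
```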
