all 38 comments

[–][deleted] 31 points  (4 children)

Check this guy out: https://www.tiktok.com/@whiskytango23

Sorry that it's a TikTok account, it just happens to be where he posts the most.

He's building a doomsday knowledge vault with 2TB+ of data and its own LLM to interact with it, along with a bunch of other stuff, based on a Raspberry Pi in a Pelican case.

[–]Minute_Attempt3063 13 points  (0 children)

Man, if they were on YouTube, they would have gotten a lot more attention...

[–]SpiritualWeight4032 0 points  (0 children)

So it’s a rugged laptop with some software downloaded and an SDR connected?

You can do the same for free on your Windows computer by installing:

- Kiwix (for Wikipedia)
- LM Studio (for local AI)
- SDRConsole (SDR)

[–]Some_Endian_FP17 0 points  (0 children)

I hope it's protected against EMP somehow.

[–]Red_Redditor_Reddit 24 points  (9 children)

If you want that kind of info, honestly your best bet is military training manuals. You can buy the books, the PDFs are free, and they're simple enough to understand while being shot at.

I would seriously avoid using an LLM for anything that's mission critical.

[–]PoweredByMeanBean 10 points  (6 children)

I think it depends on what OP means by "rebuilding civilization". Like if WW3 cripples our ability to manufacture computer chips and it takes down the internet, I won't know how to build an analogue-only coal power plant. And it's possible no one else will either. So having an LLM that can explain all the steps required in great detail would really help get local power generation back online until smarter people than me get everything built back out, which could take years.

Even stuff as basic as concrete I might not know how to make using locally available materials. But with a few prompts and follow-up questions to ChatGPT, I was able to get information about the naturally occurring precursors to the chemicals in cement, what temperature to heat the kiln to, the chemical processes involved, etc., which wouldn't be covered by a traditional survival guide.

[–]Red_Redditor_Reddit 3 points  (5 children)

I think it depends on what OP means by "rebuilding civilization".

If he means like 72 virgins, I don't think the LLM is gonna help lol. Seriously though, a copy of Wikipedia would be far more helpful than an LLM.

On a side note, part of my job involves the production of concrete, so I asked Mistral Nemo how to make concrete using materials found in the wild. The first time, it didn't give me instructions for making concrete, but rather for a sort of lime-treated soil brick. It would technically work, but it's more of a stabilized mud brick than true concrete.

When I asked specifically how to make portland cement, it did give me a technically correct list of instructions; however, it wasn't really a practical one. Different areas have different materials available that can work but aren't on the list. It also said nothing about admixtures or reinforcement. Even if you were to make a functional concrete with these instructions, it would harden literally within a few minutes without admixtures, and could only be used where it's under a compression load. It wouldn't be useful in the ways people use concrete today.

[–]redoubt515 1 point  (4 children)

Seriously though, a copy of Wikipedia would be far more helpful than an LLM.

It probably would be more useful (and projects like that already exist). But it's a fairly different sort of helpful/useful compared to an LLM. One doesn't eliminate the usefulness of the other.

[–]Red_Redditor_Reddit 2 points  (3 children)

I dunno. As someone who knows concrete, I would give that model a solid C-. It's not giving totally bad info, but it's super far from actually being useful. Some things really need to be studied to be done right.

[–]redoubt515 2 points  (2 children)

A solid C- (across a broad range of subjects) isn't so bad for the context OP is talking about.

Condensing information down to a super compressed form (whether a traditional encyclopedia, Wikipedia, or a small LLM that can fit on a low power consumer device) will never be of the same caliber or depth as actual expertise and deep knowledge of a subject.

[–]Red_Redditor_Reddit -1 points  (1 child)

What I'm saying is that the LLM isn't really useful. At the very least, Wikipedia on non-political subjects isn't usually making stuff up. It's also far clearer about its implied limitations. That LLM will not only make stuff up, but it will do it in a way that makes you think it 100% knows what it's talking about.

[–]redoubt515 0 points  (0 children)

That LLM will not only make stuff up, but it will do it in a way that makes you think it 100% knows what it's talking about.

Yeah, that's probably the biggest problem with LLMs. Still, they can be quite useful if you understand the limitations and bear them in mind. I treat talking to an LLM like talking to a confident-sounding redditor: typically I can learn a lot, but a not-inconsequential percentage of what it says can be wrong, so I treat it skeptically and try to stay conscious that sounding right and being right are two different things.

[–]Top-Opinion-7854 0 points  (1 child)

Where are the PDFs?

[–]Red_Redditor_Reddit 0 points  (0 children)

A quick Google search:

https://static.e-publishing.af.mil/production/1/af_a3/publication/afh10-644/afh10-644.pdf

There's more. Just Google it. It's not like they're hidden somewhere or under copyright.

[–]rorowhat 11 points  (3 children)

You can download the complete Wikipedia for offline viewing, around 110 GB, via Kiwix.

[–]PoweredByMeanBean 2 points  (2 children)

What is the best way to store that digitally long term?

[–]rorowhat 1 point  (1 child)

I got a USB hard drive with all the tools, LLMs, wikis, etc. in case crap goes down. I also have a laptop that I never use but have loaded up with goodies.

[–]unculturedperl 2 points  (0 children)

Test it at least every year.

[–]Syzygy___ 8 points  (2 children)

I think RAG would be a better approach here.

Find the most lightweight model that handles RAG well, and put all the relevant information in a database.

Considering that people have gotten even Mistral and Llama 2 running on phones (and probably more recent models as well), I think this should be well within the realm of possibility with a little effort.
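The retrieval side of that setup can be sketched in a few lines. This is a toy illustration, not any specific library: plain keyword-overlap scoring stands in for a real embedding index, and the actual LLM call is left out — all names here are made up for the example.

```python
# Minimal RAG retrieval sketch: rank documents by cosine similarity over
# bag-of-words term counts, then stuff the top hits into a prompt.
import math
import re
from collections import Counter

def tokenize(text):
    # Lowercase word tokens only; real systems use embeddings instead.
    return re.findall(r"[a-z]+", text.lower())

def score(query, doc):
    # Cosine similarity between term-frequency vectors.
    q, d = Counter(tokenize(query)), Counter(tokenize(doc))
    overlap = sum(q[t] * d[t] for t in q)
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in d.values())))
    return overlap / norm if norm else 0.0

def retrieve(query, docs, k=2):
    # Return the k best-matching documents.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, docs):
    # Prepend retrieved context so the model answers from the database,
    # not from (possibly hallucinated) parametric memory.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Portland cement is made by heating limestone and clay in a kiln.",
    "Solar panels convert sunlight into electricity.",
    "Concrete is a mix of cement, water, sand, and aggregate.",
]
prompt = build_prompt("How do I make cement?", docs)
```

A real build would swap `score`/`retrieve` for a vector store and hand `prompt` to whatever lightweight local model you settled on; the shape of the pipeline stays the same.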

[–]tron_cruise 1 point  (0 children)

I wouldn't do this without RAG. Quickly accessing the relevant reference material would be one of the main benefits.

[–]mamelukturbo 1 point  (0 children)

RAG is definitely the way to go for data consistency. I have a chat with a few web searches saved in SillyTavern's databank, and the data is always pulled in nicely when I bring it up in the chat, although when testing this, some web searches themselves pull irrelevant data, or so much data that the LLM can't decide what is relevant. So in the end it all comes down to how good the data you feed the databank is.

I can load up to a 12B model (including Mistral 12B Instruct) on my Android phone, a OnePlus 10T, but I have no idea whether the app ChatterUI supports RAG. Maybe running koboldcpp in Termux could be an option; I'm not 100% sure how well KoboldAI handles RAG. I've only played around with RAG/databank in SillyTavern and open-webui, nothing extensive. On the phone, context length could become an issue; I don't think I got over 16k with the larger models.

[–]nntb 2 points  (2 children)

There are, but the performance leaves much to be desired

[–]mamelukturbo 1 point  (0 children)

On a OnePlus 10T with ChatterUI, I can load up to a 12B model locally and get 3-8 tokens/sec depending on the size of the model. Not great, not terrible. Still faster than most people I know can type. With a smallish model like Gemmasutra 2B it's almost 10 tokens/s, and that's all I need for quick...research. Considering it's an old phone, I presume a 2024 flagship would fare a lot better.

[–]schorhr 0 points  (0 children)

As long as even larger LLMs output things like "...when comparing cow eggs and chicken eggs, cow eggs contain more vitamin D...", I wouldn't trust any output for survival ;-)

[–]Some_Endian_FP17 2 points  (0 children)

I would get a full RAG setup working on a laptop so you have a big chunk of human knowledge available in case SHTF.

A nuclear war would render all that meaningless though. EMPs from high altitude nuclear detonations would fry all unprotected electronics so your phone or laptop becomes a paperweight, a souvenir of a vanished civilization.

[–]Full-Sense5308 Llama 7B 1 point  (0 children)

Is Gemma capable?

[–]Minute_Attempt3063 1 point  (1 child)

Honestly, I would never trust any ML model in life-or-death situations, when there are little to no humans left.

We didn't need it before; hell, we don't need it now.

[–]DamionDreggs 0 points  (0 children)

We had a tribe with experience and culture in the arts of wilderness survival before.

[–]ActualDW 1 point  (0 children)

Just print a damn book, lol.

[–]Status-Shock-880 0 points  (0 children)

This is a fascinating topic. Frankly, I would make sure you also have an archive of survival and basic-skills manuals, or fine-tune something on them, because I'm thinking the first bunch of decades will be focused on pretty basic tasks.

[–]DeProgrammer99 0 points  (2 children)

There are a few anime series with that plot.

[–]DeProgrammer99 0 points  (0 children)

https://www.anime-planet.com/anime/in-another-world-with-my-smartphone is the obvious one, but I vaguely remember seeing one or two others... maybe https://www.anime-planet.com/anime/death-march-to-the-parallel-world-rhapsody was one. Either way, they're not exactly trying to rebuild civilization, nor is it after an apocalypse, but they're in a less advanced world using a smartphone for modern information.

[–]unculturedperl 0 points  (0 children)

Make sure it has instructions on procuring toilet paper after the apocalypse.

[–]Expensive_Mode_3413 -2 points  (0 children)

We both had a similar idea 💡 Someone should make a startup to address this gap.