Show HN: NoSQL, but it's SQLite
(gist.github.com)
91 points by vsroy 2 days ago | 45 comments
Manipulate your SQLite database like a giant Javascript object. Built with o1.
eterm 2 days ago | prev | next |
It's stored in :memory:, so the same interface already exists with:
let x = {}
x['foo'] = bar
This is a parody because the implementation is hidden, and I'm not convinced the implementation isn't just newing an object.
vsroy 2 days ago | root | parent |
The implementation is very clearly included, if you... scrolled down.
And the point is, you could easily do `const db = new Database("./database.sqlite")` instead.
The wrapper makes it so manipulating your database is just like manipulating a plain Javascript object.
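Roughly, the whole trick is a Proxy in front of a key/value table. A simplified sketch (using better-sqlite3; not the exact gist code):

const Database = require("better-sqlite3");

function openStore(path = ":memory:") {
  const db = new Database(path);
  db.exec("CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)");
  const selectStmt = db.prepare("SELECT value FROM kv WHERE key = ?");
  const upsertStmt = db.prepare(
    "INSERT INTO kv (key, value) VALUES (?, ?) " +
    "ON CONFLICT(key) DO UPDATE SET value = excluded.value"
  );
  return new Proxy({}, {
    get: (_, key) => {
      const row = selectStmt.get(String(key));
      return row ? JSON.parse(row.value) : undefined;  // deserialize on read
    },
    set: (_, key, val) => {
      upsertStmt.run(String(key), JSON.stringify(val)); // serialize on write
      return true;
    },
  });
}

// const store = openStore("./database.sqlite");
// store.foo = { hello: "world" };   // persisted to SQLite
// store.foo.hello                   // "world"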
eterm 2 days ago | root | parent |
It's since been edited. At the time of my original post it was importing from a file not present on this page.
jazzyjackson 2 days ago | prev | next |
Not a database, just a map of json strings where you can update the json stored at some key. You could write the same interface on top of localStorage.
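Something in this shape, say (a sketch, browser-only):

const store = new Proxy({}, {
  get: (_, key) => {
    const raw = localStorage.getItem(String(key));
    return raw === null ? undefined : JSON.parse(raw);
  },
  set: (_, key, val) => {
    localStorage.setItem(String(key), JSON.stringify(val));
    return true;
  },
});

// store.user = { name: "ada" };  // stored as a JSON string under the "user" key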
vsroy 2 days ago | root | parent |
Yes that is exactly what it is! It's basically a very small upgrade over my favorite database: A JSON-file :)
iKlsR 2 days ago | root | parent | next |
Literally did something similar just last week. I was looking for a good Redis GUI on Windows and couldn't find one that clicked (the closest I got was Another Redis Desktop Manager, but it sorted my keys like 1, 11, 2, etc.), so I turned to SQLite and implemented this. I also added a "sync to disk" method so I get both the benefits of RAM and persistence. It worked out great, since the data I'm getting over TCP has a sequence number, so in case of any errors I resume from the last sequence number in the db. I'm thinking of fully committing and moving some stuff out of the language, like decoding the raw bytes to build a JSON object, into an extension.
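Roughly this shape, sketched in JS with better-sqlite3 just for illustration (not my actual code):

const Database = require("better-sqlite3");

// hot data lives in RAM; snapshot to disk periodically
const db = new Database(":memory:");
db.exec(`
  CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT);
  CREATE TABLE IF NOT EXISTS meta (k TEXT PRIMARY KEY, v TEXT);
`);

function syncToDisk(path = "./snapshot.db") {
  return db.backup(path);  // SQLite's online backup API, exposed by better-sqlite3
}

function lastSequence() {
  // on restart, read this from the snapshot before resuming the TCP stream
  const row = db.prepare("SELECT v FROM meta WHERE k = 'last_seq'").get();
  return row ? Number(row.v) : 0;
}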
jazzyjackson 2 days ago | root | parent | prev |
What does it give you above just holding a big object in memory? I guess partial serialization is something, so updates are stored on disk... But then why not just store a JSON file per key on disk? It's not like the serialized blobs allow you to have indexes or a particularly efficient full-text search, so why bother with the SQL statements at all?
b33f 2 days ago | prev | next |
Couchbase Mobile has been doing this for over a decade, and early versions of Membase 15 years ago used a SQLite backend as a NoSQL JSON datastore.
messe 2 days ago | root | parent | next |
I'm using something like this for a small personal project that's only going to have a couple of users. Basically, it's just an app for myself and my girlfriend for all of the various restaurants, movies, recipes, TV shows, locations, etc. that we plan to go to/do at some point in the future. It's basically just a glorified todo list that uses APIs (TheMovieDataBase, OpenStreetMap, etc.) to grab additional metadata and images to present everything nicely.
I want us both to be able to make notes/add ratings to each item, so the set of tables looks like this:
- TodoItems
- Notes
- Ratings
Where every TodoItem can have multiple Ratings/Notes attached. Because each of the TodoItems is going to be of a different type with different metadata depending on the type of item (IMDB/TMDB id, image url, GPS location), and I want it to be extensible in future, its schema has ended up looking like this:
CREATE TABLE TodoItems (
id INTEGER PRIMARY KEY NOT NULL,
kind TEXT NOT NULL,
metadata BLOB NOT NULL
);
With SQLite's json manipulation functions, it's actually pretty pleasant to work with. As it grows I might end up adding some indexes, but for now the performance seems like it will be fine for this very low traffic use case. And it makes deployment and backups incredibly simple.
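For example, queries end up looking roughly like this (illustrative field names, sketched with better-sqlite3 purely for demonstration):

const Database = require("better-sqlite3");
const db = new Database("./todo.db");

// pull a field out of the JSON metadata
const movies = db.prepare(`
  SELECT id, json_extract(metadata, '$.title') AS title
  FROM TodoItems
  WHERE kind = 'movie'
`).all();

// update a single field in place
function setImageUrl(itemId, newImageUrl) {
  db.prepare(`
    UPDATE TodoItems
    SET metadata = json_set(metadata, '$.image_url', ?)
    WHERE id = ?
  `).run(newImageUrl, itemId);
}

// later, an expression index is possible:
// CREATE INDEX idx_tmdb ON TodoItems(json_extract(metadata, '$.tmdb_id'));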
motorest 2 days ago | root | parent | prev |
Postgres added native support for JSON in 2012. People have been using RDBMSes to store denormalized data, and even as key-value stores, for way longer than that. In fact, it's very hard not to do that.
iLoveOncall 2 days ago | prev | next |
The worst of both worlds, perfection.
redwood 2 days ago | prev | next |
PowerSync does something similar I believe
stanac 2 days ago | prev | next |
I did something similar with dotnet and LINQ. The idea was to create something like Marten, but for SQLite instead of Postgres. I stopped working on it some time ago; the thing that was really slow was de/serialization, but with the new source generators for JSON maybe it can be sped up.
revskill 2 days ago | prev | next |
At work we're using SQL Server, and I store all JSON as base64 strings though.
p2detar 2 days ago | root | parent | next |
Not sure what your exact use case is (I'm curious, actually), but storing plain JSON strings should work much better. JSON functions have been supported since SQL Server 2016 [0]. This is how I do it atm: I store only indexable content in table columns and everything else goes into an `attributes` JSON column. MSSQL supports indexes even on JSON fields, but I have not tried that yet.
0 - https://learn.microsoft.com/en-us/sql/relational-databases/j...
revskill 2 days ago | root | parent |
I'm still figuring out why i do this.
dontdoxxme 2 days ago | root | parent | prev | next |
Submitted it to The Daily WTF yet?
revskill 2 days ago | root | parent |
Why ??
eterm 2 days ago | root | parent | next |
Others are being mean by not explaining the joke.
Firstly, SQL Server has built-in JSON support, which lets you query and manipulate the JSON directly: https://learn.microsoft.com/en-us/sql/relational-databases/j...
Secondly, JSON is already serialized, so it doesn't make sense to store it as a base64 string. You're adding roughly 33% data overhead just to transform a string into a string. Base64 is useful for serializing opaque binary formats.
Lastly, some people might get a wry smile from the fact that you have the power of a relational database but are just trying to store "json" rather than an actual relational model.
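To make the first point concrete: filtering on a field inside the JSON is just JSON_VALUE in the WHERE clause. A rough sketch with the node `mssql` driver (untested; table and column names are made up):

const sql = require("mssql");

async function bigOrders() {
  // assumes `attributes` is an NVARCHAR column holding plain JSON text, not base64
  const pool = await sql.connect(process.env.MSSQL_CONN);
  const result = await pool.request().query(`
    SELECT id, JSON_VALUE(attributes, '$.quantity') AS quantity
    FROM Orders
    WHERE TRY_CAST(JSON_VALUE(attributes, '$.quantity') AS int) >= 20
  `);
  return result.recordset;
}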
mcny 2 days ago | root | parent | prev |
How do you query JSON with SQL Server? Let's say you have one data point like this:
{ "id": 42, "quantity": 12, bla bla bla
And you want the rows where this column has a quantity and quantity ≥ 20.
How do you do it if you encode everything as base64?
isoprophlex 2 days ago | root | parent | next |
You slap a full text index on the base64 string. There's only a finite number of base64 substrings for the un-encoded substrings "id", 42, etcetera, so you first filter on those. Then you decode those full strings into json and do the final filtering application side. Easy!
6510 2 days ago | root | parent | prev |
<joking>have col names id, quantity, json and greaterthan20
thih9 2 days ago | root | parent |
This is only a joke until a manager hears it. Then it’s part of the Q1 roadmap and we will refactor it in Q3.
revendell_elf 2 days ago | root | parent | prev |
[dead]
richrichie 2 days ago | prev | next |
Doesn't sqlite-utils do this and more, better?
keepamovin 2 days ago | prev | next |
Beautiful. Please turn it into a repository. You wrangled that AI masterfully for this. Well done! :)
chx 2 days ago | prev | next |
> Built with o1.
Yes, yes, a database with AI-written code. NoSQL with a database that can't be trusted with your data? I. have. seen. this. before. To quote a classic:
> I suggest you pipe your data to devnull it will be very fast
In defense of the database that video was about: I worked as a software architect for the company which became its first commercial user; Eliot hilariously didn't want to accept money for support at first. Good old days. However, around 2015, when all three large open-source SQL databases (SQLite, PostgreSQL, MySQL) added JSON support, I felt there was no more need for these NoSQL systems.
chx 2 days ago | root | parent |
[flagged]
vsroy 2 days ago | prev | next |
dang -- why was this flagged? Seems like a perfectly reasonable post.
hda111 9 hours ago | root | parent | next |
Maybe because it's written with an LLM.
dang 2 days ago | root | parent | prev |
Users flagged it. We can only guess why users flag things, but there are usually clues in the comments. I've turned the flags off now.
p.s. @dang doesn't work - you have to email hn@ycombinator.com if you want mostly-guaranteed message delivery.
revendell_elf 2 days ago | prev | next |
[dead]
yakorevivan 2 days ago | prev | next |
[dead]
cwillu 2 days ago | prev | next |
[flagged]
cwillu 2 days ago | root | parent | next |
I'd be a little less reflexively hostile if the prompting used to generate it was prominently included, so that it was at least possible to see which properties of the output were deliberately asked for.
When I read a piece of code I don't understand, ideally I'm able to assume that the author intended what was written, and learn something when I figure out why. With AI generated code, I can't do that: the original author is more or less destroyed once the context is gone; at best I can ask someone who was kinda-sorta around when the author wrote it (i.e. the person who prompted it). At worst I'm really just asking someone (or something) else to guess what was intended.
Even granting that the generated code is implementing a coherent design (and that's a big if!), what we're left with is a very quick way to generate legacy code whose author is no longer around to answer questions. If such legacy code has been proven to work, I might just stick a "don't touch this" sign on it, but if it starts causing any problems, I'd end up doing a ground-up rewrite if only to (potentially) understand what problems and concerns influenced the design to make it what it is.
AI can certainly be useful for this latter task, but it'd be far preferable to just start there, as a form of pair-programming. And at that point, I doubt it'd be so interesting to say "built with ‹whatever model is popular today›" in the title.
2 days ago | root | parent | next |
vsroy 2 days ago | root | parent | prev |
The API is basically the prompt LOL. I chucked in the API, and had o1-pro write tests + fix bugs until it worked.
iLoveOncall 2 days ago | root | parent | prev |
Seems to mean "negative-acknowledgement" if like me you've never seen this abbreviation used before.
And yeah I agree.
vsroy 2 days ago | root | parent |
o1-pro understands JS proxies better than me tbh, so probably a positive acknowledgment.
iLoveOncall 2 days ago | root | parent |
The fact that you don't seem to realize how this is even worse is scary.
isoprophlex 2 days ago | prev | next |
[flagged]
vsroy 2 days ago | root | parent |
100% ;)
2 days ago | prev |
senko 8 hours ago | next |
I've built this (unironically) for Python: https://github.com/senko/dante
It combines the convenience of SQLite (nothing to install) with the convenience of just throwing stuff in a dict. Perfect for quick prototypes/mockups.