
lolovaldez

Have to say, if you like writing bare-metal SQL and using the nice features of Postgres, working with this lib is a total joy!

No “fancy”, leaky abstractions like Prisma, no convoluted type annotations like other TS libs; just plain SQL statements in the most natural JavaScript way of doing it.

I discovered it a couple of years ago while learning Postgres, after getting sick of trying to hack around Prisma. I tried out several others, and after a couple of minutes using this, I knew I wasn’t going back.

cultofmetatron

I'd love to see a library like Ecto (Elixir) for JavaScript. It's easily the best toolkit for working with SQL I've ever found: simple, powerful, and easy to reason about.

cpursley

If you like Ecto but want something closer to the metal (no need to define type mappings), then check out https://github.com/robconery/moebius

I vastly prefer it over ecto, and ecto is awesome.

jmondi

Curious what you think makes it better than other examples out there? How is it different from Active Record or Prisma?

cultofmetatron

The full list would merit its own dedicated blog post, but to highlight a few nice parts: Ecto lets me write SQL-shaped code in Elixir. It doesn't try to wrap everything into objects, but I do get schemas. So, for instance, I can create a schema `Player` which maps to a table "players".

I can then get them via:

```
from(p in Player)
|> where([p], p.id in ^player_ids)
|> select([p], p)
|> Repo.all()
```

Need to join with an association?

```
from(p in Player)
|> join([p], s in assoc(p, :scores), as: :score)
|> where([p], p.id in ^player_ids)
|> preload([p, score: s], [score: s])
|> select([p], p)
|> Repo.all()
```

This gets me all players with their scores.

Need the sum of scores for all players?

```
from(p in Player)
|> join([p], s in assoc(p, :scores), as: :score)
|> where([p], p.id in ^player_ids)
|> select([p, score: s], %{player_id: p.id, score: sum(s.value)})
|> group_by([p, score: s], [p.id])
|> Repo.all()
```

As you can see, you can unwind this to SQL in your head pretty fast. But you can also let the system grab the kitchen sink when you just need to get something working fast. And it's all very easy to modify and decouple.

This is hardly exhaustive, though; I could spend a day talking about Ecto. I spent 8 years developing in Node.js, but now I work exclusively in Elixir. Unless what you're doing is 90% done by existing libraries, it's just a better tool for building highly performant backends.

esafak

Prisma appears to have a transaction problem: https://github.com/prisma/prisma/discussions/12715

FpUser

Was never fond of layers above SQL either. One would have to pry SQL from my cold dead fingers

Rapzid

I love layers above SQL because they can make a lot of domain logic nice to model. But dropping down when it makes sense, onto a rock-solid foundation, is nice too.

<3 Entity Framework Core but also love the new interfaces that let me cast a db connection to Npgsql connection and use its advanced features and raw performance.

It would be great if Postgres.js could underpin Knex and MikroORM.

ianberdin

Yeah, it's totally fun to work with this code after the responsible developer leaves the company. No types, no annotations, no abstractions, no migrations, just plain SQL.

porsager

I can't tell if you're being snarky or if you're being earnest.

If snarky, is this based on something that actually happened to you? If so, I'd love to hear how it actually went, and what it was you were unable to grasp when things weren't bogged down by complexity.

If earnest, I'm glad more people prefer code that isn't littered with premature abstractions, redundant types, and useless comments expressing what can be more clearly read in the actual code.

ttyyzz

What made you dislike Prisma's raw query feature?

Rapzid

This library is properly fast. Don't get distracted comparing it to other libraries just on the basis of similar string-template syntax; it doesn't have a dependency on pg because it implements its own low-level client.

I've built systems on this that load batches of 10k+ records at a time, and it crushes.

Rapzid

To expand on my own reply..

It can load 10-20k records (YMMV) in 20ms or under, where it would otherwise take 150ms. That alone can be a game changer depending on the use case, but if you're used to thinking about what's viable in Ruby and the like, it's even more inspiring.

It also supports logical decoding, which opens a lot of doors for processing transactional events.

Very impressive project; don't sleep on it!

dang

Related:

Show HN: Postgres.js – Fastest Full-Featured PostgreSQL Client for Node and Deno - https://news.ycombinator.com/item?id=30794332 - March 2022 (83 comments)

Show HN: Postgres.js – Fast PostgreSQL Client for Node.js - https://news.ycombinator.com/item?id=21864245 - Dec 2019 (12 comments)

terraplura

Perhaps not as full-featured as OP’s, but I’ve found https://pgtyped.dev/ useful. Queries are written as SQL files, and PgTyped creates typed functions you call with query parameters.

ttfkam

Impressive

porsager

Author here. Worth noting is that Cloudflare is now also supported :)

laurels-marts

this library looks incredible. well done!

pyrolistical

I like how it has zero deps

geniium

Rare enough to highlight it

noduerme

>> Prepared statements will automatically be created for any queries where it can be inferred that the query is static

What does this mean in practice? Like, actual prepared statements are created against the DB session if there are no bound variables, even if that query is only made once?

If so, it's an interesting but highly opinionated approach...

latch

If you're using parameterized queries, then you _have_ to use PostgreSQL's "Extended Query" flow, which is what most people would think of as a "prepared statement". This is hardly opinionated.

But normally, you use an unnamed prepared statement and/or portal, which PG will clean up for you, essentially only letting you have one of those per session (what we think of as a connection).

I agree that sentence didn't make any sense. So I looked at the code (1) and what they mean is that they'll use a named prepared statement automatically, essentially caching the prepared statement within PG and the driver itself. They create a signature for the statement. I agree, this is opinionated!

(1) The main place where the parse/describe/bind/execute/sync data is created is, in my opinion, pretty bad code: https://github.com/porsager/postgres/blob/bf082a5c0ffe214924...

butlerm

In most SQL databases, once you have a connection handle, you can allocate an arbitrary number of statement handles, and each SQL "statement" can be executed independently, has an associated query, parameters, a result set, and so on.

It is entirely ordinary with an API like that to prepare a statement, bind parameters and columns, and execute and fetch the results. You can then reuse a statement in its prepared state, but usually with different parameter values, as many times as you want within the same session.

The performance advantage of doing this for non-trivial queries is so substantial that many databases have a server side parse cache that is checked for a match even when a client has made no attempt to reuse a statement as such. That is easier if you bind parameters, but it is possible for a database to internally treat all embedded literals as parameters for caching purposes.
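That last idea, treating embedded literals as parameters so queries differing only in literal values can share one cache entry, can be sketched roughly. The regexes below are illustrative only; a real database does this on the parsed query tree, not with string rewriting.

```javascript
// Rough sketch: rewrite embedded literals to numbered placeholders so
// that queries differing only in literal values normalize to the same
// text, and thus hit the same cached plan. Illustrative only.
function normalize(queryText) {
  let n = 0;
  // Single pass matching either a quoted string literal or a number,
  // replacing each with the next $n placeholder.
  return queryText.replace(/'(?:[^']|'')*'|\b\d+(?:\.\d+)?\b/g, () => '$' + ++n);
}

normalize("select * from users where id = 42");
// -> 'select * from users where id = $1'
normalize("select * from users where id = 99");
// -> same normalized text, so it matches the same cache entry
```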

fauigerzigerk

Looks like named prepared statements can be disabled by setting "prepare" to false in the connection settings:

https://github.com/porsager/postgres#connection-details
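Per that README section, the option is passed in the connection settings; a minimal sketch (the connection string here is a placeholder):

```javascript
import postgres from 'postgres'

// Disable automatic named prepared statements, e.g. when going through a
// transaction-mode pooler that doesn't support them. Connection string is
// a placeholder; see the linked README for the full option list.
const sql = postgres('postgres://user:pass@localhost:5432/mydb', {
  prepare: false
})
```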

noduerme

Thanks... I haven't got the mental energy to follow their code ATM but yeah, it seems weird to buffer a static query as a prepared statement if it's only going to be used once.

Maybe that kind of goes with the Node.js philosophy, though? It seems to assume that in most cases a static query will recur... and maybe that's usually accurate with long-running persistent connections. I'm much more used to working in PHP without persistent connections, and so to sparing the DB any extra prepare call unless it's directly going to benefit later in the script.

ndriscoll

I believe it's typical for database libraries with auto-prepare to also have batch statement sending (and use it for prepare-executes), so there's no real cost to doing the prepare. The database has to parse/plan the statement regardless, and there's no round-trip delay with statement batching.

porsager

The input parameters are still dynamic which is made explicit with tagged template literals, so that makes almost all queries static in nature.
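The split between static query text and dynamic parameters is exactly what a tag function sees. A toy tag illustrating the mechanism (not the Postgres.js implementation, just a sketch of the idea):

```javascript
// A tagged template receives the static string parts separately from the
// interpolated values: the text is static (so it can be prepared/cached),
// while the values travel out-of-band as query parameters.
function sql(strings, ...values) {
  // Join the static parts with $1, $2, ... placeholders between them.
  const text = strings.reduce((query, part, i) => query + '$' + i + part);
  return { text, values };
}

const id = 42;
const q = sql`select * from users where id = ${id}`;
// q.text   -> 'select * from users where id = $1'
// q.values -> [42]
```

This is also why the tagged-template style is injection-safe by construction: interpolated values never become part of the SQL text.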

noduerme

My own, very opinionated way of doing this in Node.js was to wrap node-mysql in promises and then force myself to make it explicit, when preparing a query, whether I want it to set up a prepared statement that returns a reusable handle or to run directly. That requires tracking prepared statement names and key/values for each one.

you'll probably find this bad code too, but it was more of an experiment... I still don't feel safe using nodejs in deployment.

https://github.com/joshstrike/StrikeDB/blob/master/src/Strik...

seanhunter

Postgres supports two plan types for prepared statements: "generic" plans, which are reused regardless of the parameter values, and "custom" plans, which are planned for a specific execution with the parameter values baked in.

https://www.postgresql.org/docs/current/sql-prepare.html explains it. Read the section called "Notes" for the plan types.

skrebbel

I feel obliged to namedrop Zapatos here, another postgres JS client with a remarkably similar design (sql in template strings), but fully typesafe. If TypeScript is your thing, you might appreciate it: https://jawj.github.io/zapatos/

Personally I feel it’s one of the best designed (and documented) TS libraries out there and I’m sad it’s not very well known.

xctr94

We had to abandon Zapatos because a) it doesn’t support multiple schemas; b) the types wouldn’t always be very readable.

PgTyped was the alternative, and 2 years later I’m very glad we made the switch.

gmac

It does now support multiple schemas: https://jawj.github.io/zapatos/#60

9dev

I don’t know if this has changed recently, but I’m using multiple schemas with Zapatos just fine. The type you can import looks like `auth.user.Selectable`, where auth is the schema and user the table.

Exuma

What do you mean by multiple schemas? What's an example of a non-readable type?

pimeys

Like this, it's pretty common way to use Postgres (with multiple schemas):

  CREATE SCHEMA a;
  CREATE SCHEMA b;

  CREATE TABLE a.foo (id SERIAL PRIMARY KEY);
  CREATE TABLE b.foo (id SERIAL PRIMARY KEY);

xctr94

I didn’t know support for schemas had landed, but the docs still seem obtuse for joins and upserts. Compared to plain SQL files in PgTyped (which admittedly needs some annotations to work, and lacks transactions), I find the lateral joins in Zapatos really verbose and ugly. Lots of extra parameters too.

Which isn’t to say it’s not a great tool! You pick what you like :)

hhfdsbvc

Ever try drizzle?

xctr94

I haven’t, but it’s an ORM, so it's an automatic no. We tested six ORMs before deciding that plain SQL with introspection was the way to go. No regrets.

9dev

Seconded. The lateral join features in combination with the shortcut functions make it exceptionally easy to build backends with actual SQL, without compromising on ease of use.

oulipo

Does that also work with Postgres-compatible backends like BigQuery et al.?

gmac

It relies on pg (node-postgres) to talk to Postgres, so it should work with anything pg can talk to.

willtemperley

At first glance Zapatos doesn't appear to be Cloudflare workers compatible though I'd love to be corrected.

gmac

It should be, now that pg runs fine there. If it’s not, please file an issue!

It also works with @neondatabase/serverless on platforms without TCP connections (though it’s on my TODO list to make this less fiddly): https://github.com/neondatabase/neon-vercel-zapatos

jokesterfr

Do you know why Zapatos relies on pg and not Postgres.js?

qwertox

I was wondering how old this project is:

v1.0.1 - Jan 2020

v2.0.0 - Jun 2020 but never left beta.

v3.0.0 - Mar 2022, which appears to be when the project really got started.

I also wonder how solid it is, because it looks very interesting.

promiseofbeans

I've been using it in production (Deno) for a while. It's pretty solid, but their documentation is sorely lacking, so you often have to feel your way to getting more complex things to work.

twosdai

Could you comment on what you're using Deno for in prod? I'm curious to learn more about Deno for production use cases. I'm a big TS and Node.js user, and Deno has always seemed interesting to me. However, I'm afraid of being an early mover, so I like to learn about use cases people have validated.

promiseofbeans

We use it for processing real-time UDP streams and storing the data in Postgres (plus hosting a small API for exporting said data). We also use it to re-publish the UDP streams over WebSocket for consumption by web applications.

surprisetalk

Postgres.js is one of the best libraries I’ve ever used!

Its use of templating in JS is really intuitive.

fatih-erikli

I like the idea of using tagged templates for sanitizing sql query inputs. Kudos

andrewstuart

Terrible name, makes it impossible to search on.

The problems start immediately when you realize it’s interchangeably called “Postgres”.

Any other name would have been better as long as it’s distinctive.

Even so, I use it; it's the best Postgres JavaScript library.

hluska

Try searching:

npm install postgres

My search results are always excellent: Google and DuckDuckGo deliver the documentation as the first result. Bing delivers a Stack Overflow answer about pg first, but the documentation is the second result.

I don’t really see any other problems or rather the problems are common and most developers have developed habits (like the search pattern above) to solve them.

PostgresJs: PostgreSQL client for Node.js and Deno - Hacker News