Datasets:
Only showing a preview of the rows.
| id uint32 | deleted uint8 | type int8 | by string | time timestamp[us] | text string | dead uint8 | parent uint32 | poll uint32 | kids list | url string | score int32 | title string | parts list | descendants int32 | words list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
47,811,904 | 0 | 2 | DoctorOetker | 2026-04-18T02:00:13 | which exact model, and how many tokens per second for generation? | 0 | 47,798,146 | 0 | [] | 0 | [] | 0 | [
"and",
"exact",
"for",
"generation",
"how",
"many",
"model",
"per",
"second",
"tokens",
"which"
] | ||
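The `words` column appears to hold the sorted, deduplicated lowercase tokens of the item's `text` (or `title`), split on non-alphanumeric characters; entity fragments such as "x27" and "x2f" in later rows suggest the split runs over the HTML-escaped text. This is an assumption inferred from the preview rows, not from dataset documentation; a minimal sketch of that derivation:

```python
import re

def derive_words(text: str) -> list[str]:
    # Lowercase, split on runs of characters outside [0-9a-z],
    # drop empty tokens, then return the sorted unique set.
    tokens = re.split(r"[^0-9a-z]+", text.lower())
    return sorted({t for t in tokens if t})

# The first preview row above reproduces exactly under this rule.
row_text = "which exact model, and how many tokens per second for generation?"
print(derive_words(row_text))
# ['and', 'exact', 'for', 'generation', 'how', 'many', 'model',
#  'per', 'second', 'tokens', 'which']
```

Note that the rule is order-destroying and lossy (duplicates and punctuation are gone), so `words` is usable for keyword search but not for reconstructing `text`.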
47,811,905 | 0 | 2 | tkzed49 | 2026-04-18T02:00:19 | Pixel 10 absolutely ate shit when I opened the page! | 0 | 47,809,158 | 0 | [] | 0 | [] | 0 | [
"10",
"absolutely",
"ate",
"opened",
"page",
"pixel",
"shit",
"the",
"when"
] | ||
47,811,906 | 0 | 2 | IndignantTyrant | 2026-04-18T02:00:32 | Taalas seems to be pretty good.
This is their demo:
<a href="https://chatjimmy.ai/" rel="nofollow">https://chatjimmy.ai/</a> | 0 | 47,811,121 | 0 | [] | 0 | [] | 0 | [
"ai",
"be",
"chatjimmy",
"demo",
"good",
"href",
"https",
"is",
"nofollow",
"pretty",
"rel",
"seems",
"taalas",
"their",
"this",
"to",
"x2f"
] | ||
47,811,907 | 0 | 2 | hargup | 2026-04-18T02:00:35 | Justin Lebar (he built xla compiler and worked at OpenAI) has an amazing talk about this subject <a href="https://youtu.be/cyJU32ivIlk?si=gYuHtzMJIvaSqcht" rel="nofollow">https://youtu.be/cyJU32ivIlk?si=gYuHtzMJIvaSqcht</a> | 0 | 47,807,619 | 0 | [] | 0 | [] | 0 | [
"about",
"amazing",
"an",
"and",
"at",
"be",
"built",
"compiler",
"cyju32ivilk",
"gyuhtzmjivasqcht",
"has",
"he",
"href",
"https",
"justin",
"lebar",
"nofollow",
"openai",
"rel",
"si",
"subject",
"talk",
"this",
"worked",
"x2f",
"xla",
"youtu"
] | ||
47,811,908 | 0 | 2 | hn_acker | 2026-04-18T02:00:42 | Cory Doctorow argued against using copyright law as a substitute for privacy law or labor law [1], and I argue the same for location privacy. Copyright law already gets abused enough in contexts that indisputably involve copying of creative work. I do not want to stretch copyright law into location/movement privac... | 0 | 47,807,160 | 0 | [] | 0 | [] | 0 | [
"10",
"2023",
"21",
"abused",
"against",
"already",
"an",
"and",
"app",
"argue",
"argued",
"as",
"at",
"belongs",
"by",
"comes",
"contexts",
"copy",
"copying",
"copyright",
"cory",
"created",
"creative",
"device",
"did",
"do",
"doctorow",
"does",
"enough",
"... | ||
47,811,909 | 0 | 2 | ccrone | 2026-04-18T02:00:56 | Neat! I work with the team on sbx. We built our own cross-platform VMM after running into limitations with the existing options. Happy to chat more about what you’ve built and what we’re doing: christopher<dot>crone@docker.com | 0 | 47,809,773 | 0 | [] | 0 | [] | 0 | [
"about",
"after",
"and",
"built",
"chat",
"christopher",
"com",
"crone",
"cross",
"docker",
"doing",
"dot",
"existing",
"gt",
"happy",
"into",
"limitations",
"lt",
"more",
"neat",
"on",
"options",
"our",
"own",
"platform",
"re",
"running",
"sbx",
"team",
"th... | ||
47,811,910 | 0 | 2 | BoorishBears | 2026-04-18T02:01:42 | I'm 99.9% sure Opus 4.7 is a smaller model than 4.6.<p>Too many signs between the sudden jump in TPS (biggest smoking gun for me), new tokenenizer, commentary about Project Mythos from Ant employees, etc.<p>It looks like their new Sonnet was good enough to be labeled Opus and their new Opus was good enough to be l... | 0 | 47,810,538 | 0 | [] | 0 | [] | 0 | [
"99",
"about",
"and",
"ant",
"as",
"be",
"between",
"biggest",
"commentary",
"continue",
"employees",
"enough",
"etc",
"for",
"from",
"good",
"gun",
"in",
"is",
"it",
"jump",
"labeled",
"like",
"ll",
"looks",
"many",
"me",
"model",
"more",
"mythos",
"new",... | ||
47,811,911 | 0 | 2 | stingraycharles | 2026-04-18T02:01:43 | Depends on the type of subscription. We have Codex Team and have a monthly subscription, no per-token costs. | 0 | 47,811,228 | 0 | [] | 0 | [] | 0 | [
"and",
"codex",
"costs",
"depends",
"have",
"monthly",
"no",
"of",
"on",
"per",
"subscription",
"team",
"the",
"token",
"type",
"we"
] | ||
47,811,912 | 0 | 2 | turtleyacht | 2026-04-18T02:01:59 | If that's true, what <i>is</i> "ego death?" | 0 | 47,811,863 | 0 | [] | 0 | [] | 0 | [
"death",
"ego",
"if",
"is",
"quot",
"that",
"true",
"what",
"x27"
] | ||
47,811,913 | 0 | 2 | grosswait | 2026-04-18T02:02:08 | If so, they are still going <a href="https://www.tesla.com/solarpanels" rel="nofollow">https://www.tesla.com/solarpanels</a> so I guess not | 0 | 47,811,374 | 0 | [] | 0 | [] | 0 | [
"are",
"com",
"going",
"guess",
"href",
"https",
"if",
"nofollow",
"not",
"rel",
"so",
"solarpanels",
"still",
"tesla",
"they",
"www",
"x2f"
] | ||
47,811,914 | 0 | 2 | mtlynch | 2026-04-18T02:02:39 | Corresponding HN thread: <a href="https://news.ycombinator.com/item?id=47758309">https://news.ycombinator.com/item?id=47758309</a> | 0 | 47,806,568 | 0 | [] | 0 | [] | 0 | [
"47758309",
"com",
"corresponding",
"hn",
"href",
"https",
"id",
"item",
"news",
"thread",
"x2f",
"ycombinator"
] | ||
47,811,915 | 0 | 2 | ASalazarMX | 2026-04-18T02:02:41 | - TSA: Hey, bring your bag and devices here. Routine inspection.<p>- Traveler: [takes phone from the bin] [finds lock button] [click] [click] [click]<p>- TSA: Hey, stop what you're doing Mr. Terrorist! | 0 | 47,808,737 | 0 | [] | 0 | [] | 0 | [
"and",
"bag",
"bin",
"bring",
"button",
"click",
"devices",
"doing",
"finds",
"from",
"here",
"hey",
"inspection",
"lock",
"mr",
"phone",
"re",
"routine",
"stop",
"takes",
"terrorist",
"the",
"traveler",
"tsa",
"what",
"x27",
"you",
"your"
] | ||
47,811,916 | 0 | 1 | beeswaxpat | 2026-04-18T02:02:42 | 0 | 0 | 0 | [] | https://agenticdev.blog/ | 1 | Show HN: Agentic Dev – AI dev-tools news, curated daily by Claude | [] | 0 | [
"agentic",
"ai",
"by",
"claude",
"curated",
"daily",
"dev",
"hn",
"news",
"show",
"tools"
] | |
47,811,917 | 0 | 2 | georgemcbay | 2026-04-18T02:02:55 | Yeah it was Columbia that was destroyed on reentry (17 years later). | 0 | 47,810,912 | 0 | [] | 0 | [] | 0 | [
"17",
"columbia",
"destroyed",
"it",
"later",
"on",
"reentry",
"that",
"was",
"yeah",
"years"
] | ||
47,811,918 | 0 | 2 | raphman | 2026-04-18T02:02:56 | Yeah. I'm only on the Pro plan and immediately reached my weekly Claude Design quota by having it create a slide template (with much too small text) and three versions of a system dashboard design (rather nice). No iterations.<p>Another thing: I realized how much I hate waiting for Claude to finish its thing. With... | 0 | 47,809,455 | 0 | [] | 0 | [] | 0 | [
"and",
"another",
"between",
"by",
"claude",
"code",
"create",
"dashboard",
"design",
"designs",
"feels",
"finish",
"for",
"hate",
"having",
"how",
"immediately",
"important",
"interaction",
"it",
"iterations",
"its",
"loop",
"more",
"much",
"my",
"nice",
"no",
... | ||
47,811,920 | 0 | 2 | consumer451 | 2026-04-18T02:03:08 | As a huge space nerd, I would like to point out that space, and other planets really suck.<p>The Earth is pretty cool, btw. | 0 | 47,808,913 | 0 | [] | 0 | [] | 0 | [
"and",
"as",
"btw",
"cool",
"earth",
"huge",
"is",
"like",
"nerd",
"other",
"out",
"planets",
"point",
"pretty",
"really",
"space",
"suck",
"that",
"the",
"to",
"would"
] | ||
47,811,921 | 0 | 1 | pedalpete | 2026-04-18T02:03:11 | 0 | 0 | 0 | [] | https://www.wired.com/story/this-beanie-is-designed-to-read-your-thoughts/ | 1 | Beanie Is Designed to Read Your Thoughts | [] | 1 | [
"beanie",
"designed",
"is",
"read",
"thoughts",
"to",
"your"
] | |
47,811,922 | 0 | 2 | wakawaka28 | 2026-04-18T02:03:20 | Sure, it's not implemented in Fil-C because it is very new and the point of it is to improve things without extensive rewrites.<p>Generally, I think one could want to recover from errors. But error recovery is something that needs to be designed in. You probably don't want to catch all errors, even in a loop ... | 0 | 47,811,859 | 0 | [] | 0 | [] | 0 | [
"about",
"access",
"all",
"an",
"and",
"apples",
"application",
"as",
"be",
"because",
"but",
"catch",
"could",
"designed",
"don",
"error",
"errors",
"even",
"existent",
"extensive",
"fil",
"for",
"from",
"generally",
"handle",
"handling",
"here",
"if",
"imple... | ||
47,811,923 | 0 | 2 | pedalpete | 2026-04-18T02:03:39 | <a href="https://archive.is/KVNsB" rel="nofollow">https://archive.is/KVNsB</a> | 0 | 47,811,921 | 0 | [] | 0 | [] | 0 | [
"archive",
"href",
"https",
"is",
"kvnsb",
"nofollow",
"rel",
"x2f"
] | ||
47,811,924 | 0 | 2 | datsci_est_2015 | 2026-04-18T02:03:51 | Assuming they mean the ground acts as a heat sink, and sufficiently underground you’re not subjected to the above average heat of the day and below average cold of the night. | 0 | 47,811,843 | 0 | [] | 0 | [] | 0 | [
"above",
"acts",
"and",
"as",
"assuming",
"average",
"below",
"cold",
"day",
"ground",
"heat",
"mean",
"night",
"not",
"of",
"re",
"sink",
"subjected",
"sufficiently",
"the",
"they",
"to",
"underground",
"you"
] | ||
47,811,925 | 0 | 2 | p1necone | 2026-04-18T02:04:02 | Why are you talking like this is black and white? Many things being compile time checkable is better than no things being compile time checkable. The existence of some thing in rust that can only be checked at runtime does not somehow make all the compile time checks that are possible irrelevant. | 0 | 47,811,680 | 0 | [] | 0 | [] | 0 | [
"all",
"and",
"are",
"at",
"be",
"being",
"better",
"black",
"can",
"checkable",
"checked",
"checks",
"compile",
"does",
"existence",
"in",
"irrelevant",
"is",
"like",
"make",
"many",
"no",
"not",
"of",
"only",
"possible",
"runtime",
"rust",
"some",
"somehow... | ||
47,811,926 | 0 | 2 | like_any_other | 2026-04-18T02:04:12 | Is there any species, other than humans, that is found all across the globe (i.e. geographically separated), and has not differentiated into subspecies? Wolves, elephants, tigers, bears, and foxes have all been categorized into multiple subspecies each, distinct but able to interbreed. | 0 | 47,811,283 | 0 | [] | 0 | [] | 0 | [
"able",
"across",
"all",
"and",
"any",
"bears",
"been",
"but",
"categorized",
"differentiated",
"distinct",
"each",
"elephants",
"found",
"foxes",
"geographically",
"globe",
"has",
"have",
"humans",
"interbreed",
"into",
"is",
"multiple",
"not",
"other",
"separate... | ||
47,811,927 | 0 | 2 | gnabgib | 2026-04-18T02:04:18 | (2020) At the time (1514 points, 741 comments) <a href="https://news.ycombinator.com/item?id=25078034">https://news.ycombinator.com/item?id=25078034</a> | 0 | 47,811,869 | 0 | [] | 0 | [] | 0 | [
"1514",
"2020",
"25078034",
"741",
"at",
"com",
"comments",
"href",
"https",
"id",
"item",
"news",
"points",
"the",
"time",
"x2f",
"ycombinator"
] | ||
47,811,932 | 0 | 2 | mx7zysuj4xew | 2026-04-18T02:05:25 | Wrong argument, since it's not just available to "the CIA" but every rando under the sun, people should be notified immediately if "tracking" them is possible and mitigation measures should become a common standard practice | 0 | 47,811,263 | 0 | [] | 0 | [] | 0 | [
"and",
"argument",
"available",
"be",
"become",
"but",
"cia",
"common",
"every",
"if",
"immediately",
"is",
"it",
"just",
"measures",
"mitigation",
"not",
"notified",
"people",
"possible",
"practice",
"quot",
"rando",
"should",
"since",
"standard",
"sun",
"the",... | ||
47,811,933 | 0 | 2 | secabeen | 2026-04-18T02:05:32 | >That's fair, but I suppose there's more to it than that. Any number of datasets point to the cost of admin rising far above the cost of faculty or maintenance; and a lot of them actually show on an inflation-adjusted basis that schools are spending less now on instruction than they did a decade or two ago... | 0 | 47,773,464 | 0 | [] | 0 | [] | 0 | [
"100",
"1980",
"4x",
"above",
"abuse",
"actually",
"adjusted",
"admin",
"administer",
"administration",
"administrative",
"administrators",
"ago",
"ai",
"aka",
"all",
"amount",
"an",
"and",
"any",
"are",
"argument",
"as",
"at",
"audit",
"basis",
"be",
"break",
... | ||
47,811,934 | 0 | 2 | eagerpace | 2026-04-18T02:05:44 | Very mid. If you have any experience building your own UI kit, this will just slow you down. | 0 | 47,806,725 | 0 | [] | 0 | [] | 0 | [
"any",
"building",
"down",
"experience",
"have",
"if",
"just",
"kit",
"mid",
"own",
"slow",
"this",
"ui",
"very",
"will",
"you",
"your"
] | ||
47,811,936 | 0 | 2 | WalterBright | 2026-04-18T02:05:47 | I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen. | 0 | 47,811,419 | 0 | [] | 0 | [] | 0 | [
"afraid",
"allow",
"and",
"cannot",
"disconnect",
"frank",
"happen",
"know",
"me",
"planning",
"something",
"that",
"to",
"were",
"x27",
"you"
] | ||
47,811,937 | 0 | 2 | wsun19 | 2026-04-18T02:05:52 | Pretty much every major inference American provider claims to make a profit on API-based inference. Consumer plans might be subsidized overall, but it's hard to say since they're a black box and some consumers don't fully use their plans | 0 | 47,811,696 | 0 | [] | 0 | [] | 0 | [
"american",
"and",
"api",
"based",
"be",
"black",
"box",
"but",
"claims",
"consumer",
"consumers",
"don",
"every",
"fully",
"hard",
"inference",
"it",
"major",
"make",
"might",
"much",
"on",
"overall",
"plans",
"pretty",
"profit",
"provider",
"re",
"say",
"s... | ||
47,811,938 | 0 | 2 | nebula8804 | 2026-04-18T02:06:11 | This is the problem. It's as if everything has to crash and burn for people like the person you responded to finally get some sense. By that point, it will be too late to catch up to our competitors overseas. The race will be over. I honestly don't know how to reconcile this seemingly unsolvable problem. They... | 0 | 47,811,623 | 0 | [] | 0 | [] | 0 | [
"alternative",
"amp",
"and",
"are",
"as",
"be",
"because",
"box",
"burn",
"but",
"by",
"catch",
"competitors",
"could",
"crash",
"don",
"easy",
"engineering",
"everything",
"field",
"finally",
"for",
"get",
"happens",
"has",
"have",
"honestly",
"how",
"if",
... | ||
47,811,939 | 0 | 2 | wahern | 2026-04-18T02:06:34 | And yet the homeownership rate in 1950 was 53% (an all-time high up to that point) compared to 65% today: <a href="https://www.huduser.gov/portal/sites/default/files/pdf/Housing-Situation-1951.pdf" rel="nofollow">https://www.huduser.gov/portal/sites/defau... | 0 | 47,808,992 | 0 | [] | 0 | [] | 0 | [
"15",
"1950",
"1951",
"3000",
"35",
"53",
"65",
"80",
"all",
"amount",
"amounts",
"an",
"and",
"appliances",
"back",
"because",
"bedrooms",
"but",
"buy",
"by",
"can",
"categorized",
"cellphones",
"comforts",
"compared",
"creature",
"default",
"don",
"durable",... | ||
47,811,940 | 0 | 2 | none2585 | 2026-04-18T02:07:38 | lol bro<p>Fucking of course it makes sense they are both owned by Elon. | 0 | 47,809,646 | 0 | [] | 0 | [] | 0 | [
"are",
"both",
"bro",
"by",
"course",
"elon",
"fucking",
"it",
"lol",
"makes",
"of",
"owned",
"sense",
"they"
] | ||
47,811,941 | 0 | 2 | gcbirzan | 2026-04-18T02:07:41 | Sadly, it's populated. | 0 | 47,811,920 | 0 | [] | 0 | [] | 0 | [
"it",
"populated",
"sadly",
"x27"
] | ||
47,811,942 | 0 | 2 | lrvick | 2026-04-18T02:07:43 | I find it much more valuable to exchange ideas with humans than type every curl bracket and common boilerplate pattern and debug commit myself.<p>That said, I am also actively experimenting with VTT solutions which are getting quite good. | 0 | 47,811,295 | 0 | [] | 0 | [] | 0 | [
"actively",
"also",
"am",
"and",
"are",
"boilerplate",
"bracket",
"commit",
"common",
"curl",
"debug",
"every",
"exchange",
"experimenting",
"find",
"getting",
"good",
"humans",
"ideas",
"it",
"more",
"much",
"myself",
"pattern",
"quite",
"said",
"solutions",
"t... | ||
47,811,943 | 0 | 2 | justzisguyuknow | 2026-04-18T02:08:04 | Indeed, | 0 | 47,809,549 | 0 | [] | 0 | [] | 0 | [
"indeed"
] | ||
47,811,944 | 0 | 2 | BeetleB | 2026-04-18T02:08:08 | > saying "just" dismisses its influence on the genre as a whole.<p>Well, it was meant to be parsed as:<p>Star Trek is speculative fiction and space opera.<p>Star Wars is just space opera.<p>Some space opera is also speculative fiction, but I wouldn't say it is a subset. I wouldn't call some space... | 0 | 47,810,893 | 0 | [] | 0 | [] | 0 | [
"all",
"also",
"and",
"as",
"at",
"be",
"but",
"call",
"classified",
"consensus",
"considered",
"dismisses",
"fiction",
"genre",
"gt",
"here",
"influence",
"inverted",
"is",
"it",
"its",
"just",
"lot",
"meant",
"no",
"of",
"on",
"opera",
"parsed",
"quot",
... | ||
47,811,945 | 0 | 2 | redanddead | 2026-04-18T02:08:21 | Did anybody else peep this? <a href="https://ndstudio.gov/" rel="nofollow">https://ndstudio.gov/</a> | 0 | 47,807,209 | 0 | [] | 0 | [] | 0 | [
"anybody",
"did",
"else",
"gov",
"href",
"https",
"ndstudio",
"nofollow",
"peep",
"rel",
"this",
"x2f"
] | ||
47,811,946 | 0 | 2 | elictronic | 2026-04-18T02:08:31 | We did nothing and it’s not getting better. Do nothing harder.<p>If you go in expecting you can do nothing and you can’t change the world around you then congrats, you will succeed in all you do. | 0 | 47,809,177 | 0 | [] | 0 | [] | 0 | [
"all",
"and",
"around",
"better",
"can",
"change",
"congrats",
"did",
"do",
"expecting",
"getting",
"go",
"harder",
"if",
"in",
"it",
"not",
"nothing",
"succeed",
"the",
"then",
"we",
"will",
"world",
"you"
] | ||
47,811,947 | 0 | 2 | lynndotpy | 2026-04-18T02:08:43 | At the time of this writing, the prevailing thinking with "artificial intelligence" was that we'd encode every Fact we know and every rule of Logic, and from there, the computer would make new discoveries. Todays AI researchers would call this "symbolic" AI, compared to the "neural" A... | 0 | 47,805,837 | 0 | [] | 0 | [] | 0 | [
"able",
"add",
"ai",
"an",
"and",
"answer",
"anything",
"are",
"artificial",
"assess",
"at",
"call",
"can",
"compared",
"computer",
"data",
"did",
"different",
"discoveries",
"don",
"encode",
"enough",
"every",
"fact",
"follow",
"for",
"forth",
"from",
"genera... | ||
47,811,948 | 0 | 2 | scarface_74 | 2026-04-18T02:09:07 | My responsibility is to make sure my code meets functional and non functional requirements. It’s to understand the *behavior*. My automated unit, integration, and load tests confirm that.<p>Someone thought I was naive when I said my vibe coded internal web admin site met the security requirements without looking at a... | 0 | 47,811,883 | 0 | [] | 0 | [] | 0 | [
"access",
"admin",
"amazon",
"and",
"anyone",
"anything",
"at",
"attached",
"automated",
"aws",
"because",
"behavior",
"broken",
"claude",
"code",
"coded",
"cognito",
"confirm",
"could",
"credentials",
"do",
"either",
"found",
"functional",
"had",
"has",
"if",
"... | ||
47,811,949 | 0 | 1 | PaulHoule | 2026-04-18T02:09:17 | 0 | 0 | 0 | [] | https://phys.org/news/2026-03-rna-therapeutic-possibilities.html | 1 | Predicting RNA activity expands therapeutic possibilities | [] | 0 | [
"activity",
"expands",
"possibilities",
"predicting",
"rna",
"therapeutic"
] | |
47,811,950 | 0 | 2 | bastawhiz | 2026-04-18T02:09:19 | I've been interested in understanding what would make people more amenable to data centers. We kind of need them, though arguably many of the ones being built now are motivated by foolish AI bubble incentives.<p>Quieter? Lower water use? Lower energy use? Mandatory accessory green spaces? Property taxes that refle... | 0 | 47,811,601 | 0 | [] | 0 | [] | 0 | [
"accessory",
"ai",
"amenable",
"are",
"arguably",
"been",
"being",
"bubble",
"built",
"by",
"centers",
"community",
"data",
"derived",
"different",
"don",
"downsides",
"energy",
"foolish",
"green",
"have",
"ideas",
"in",
"incentives",
"inconvenience",
"inflicted",
... | ||
47,811,951 | 0 | 2 | stingraycharles | 2026-04-18T02:09:30 | I’m quite successful doing UI with a proper design system, variables.css, atoms, molecules and organisms and constantly sticking to that. Claude seems to work well with it.<p>I explore designs in Claude Desktop, and once I’m satisfied, I’ll let Claude desktop handover prompts for Claude Code. Claude Code makes a review... | 0 | 47,811,934 | 0 | [] | 0 | [] | 0 | [
"accept",
"accomplishes",
"actual",
"all",
"and",
"application",
"as",
"at",
"atom",
"atoms",
"but",
"by",
"claude",
"code",
"constantly",
"css",
"design",
"designs",
"desktop",
"doing",
"each",
"explore",
"for",
"frontend",
"good",
"handover",
"happy",
"harness... | ||
47,811,952 | 0 | 2 | actionfromafar | 2026-04-18T02:09:42 | Maybe the orbit wobbles enough for the temperature to vary between cold and hot at the ridge. | 0 | 47,811,843 | 0 | [] | 0 | [] | 0 | [
"and",
"at",
"between",
"cold",
"enough",
"for",
"hot",
"maybe",
"orbit",
"ridge",
"temperature",
"the",
"to",
"vary",
"wobbles"
] | ||
47,811,953 | 0 | 2 | g023 | 2026-04-18T02:09:43 | We need more personal level AI solutions instead of so much corporate centered solutions. | 0 | 47,810,357 | 0 | [] | 0 | [] | 0 | [
"ai",
"centered",
"corporate",
"instead",
"level",
"more",
"much",
"need",
"of",
"personal",
"so",
"solutions",
"we"
] | ||
47,811,955 | 0 | 2 | nostromo | 2026-04-18T02:10:01 | Science is about truth not social outcomes.<p>People keep wondering why trust in scientific findings is in free fall. A big part of it is because many scientists have become comfortable lying when they feel it’s for a noble cause. | 0 | 47,811,840 | 0 | [] | 0 | [] | 0 | [
"about",
"because",
"become",
"big",
"cause",
"comfortable",
"fall",
"feel",
"findings",
"for",
"free",
"have",
"in",
"is",
"it",
"keep",
"lying",
"many",
"noble",
"not",
"of",
"outcomes",
"part",
"people",
"science",
"scientific",
"scientists",
"social",
"the... | ||
47,811,956 | 0 | 2 | slopinthebag | 2026-04-18T02:10:16 | Most of the commentators here are bots these days anyways. | 0 | 47,811,942 | 0 | [] | 0 | [] | 0 | [
"anyways",
"are",
"bots",
"commentators",
"days",
"here",
"most",
"of",
"the",
"these"
] | ||
47,811,957 | 0 | 2 | wsun19 | 2026-04-18T02:10:41 | As a PBC, the intent of the company is not only profit, but it's hard to analyze the counterfactuals of if Anthropic were a pure for-profit or a non-profit | 0 | 47,811,317 | 0 | [] | 0 | [] | 0 | [
"analyze",
"anthropic",
"as",
"but",
"company",
"counterfactuals",
"for",
"hard",
"if",
"intent",
"is",
"it",
"non",
"not",
"of",
"only",
"or",
"pbc",
"profit",
"pure",
"the",
"to",
"were",
"x27"
] | ||
47,811,959 | 0 | 1 | s_u_d_o | 2026-04-18T02:11:04 | Hey.
I don’t know how to start this. It’s all over me. I’ve been trying to learn Coding, Data Structures, Algorithms, Design Patterns, Best practices etc… but will I still need that? Am i wasting my time? Can really AI do all this, and actually do it better? Are we in an Era, where one should only need to learn the ‘ba... | 0 | 0 | 0 | [] | 1 | Do I Stop Learning Coding? DSA? | [] | 0 | [
"able",
"actually",
"ai",
"algorithms",
"all",
"am",
"an",
"and",
"anyone",
"app",
"are",
"basics",
"be",
"been",
"best",
"better",
"bootcamps",
"but",
"can",
"code",
"coding",
"coping",
"crafting",
"crisis",
"data",
"days",
"design",
"dev",
"develop",
"deve... | |
47,811,960 | 0 | 2 | sprash | 2026-04-18T02:11:08 | I would have found it more interesting if the primitive methods would be kernel syscalls instead of executables. | 0 | 47,794,311 | 0 | [] | 0 | [] | 0 | [
"be",
"executables",
"found",
"have",
"if",
"instead",
"interesting",
"it",
"kernel",
"methods",
"more",
"of",
"primitive",
"syscalls",
"the",
"would"
] | ||
47,811,961 | 0 | 2 | chychiu | 2026-04-18T02:11:39 | [delayed] | 0 | 47,811,912 | 0 | [] | 0 | [] | 0 | [
"delayed"
] | ||
47,811,962 | 0 | 2 | hyperhello | 2026-04-18T02:11:47 | This is all going to flash through your mind when your car mysteriously doesn't turn left. I would prefer to think of machines as things with defined outputs and failure is failure, more than as fluffy little kittens who might do the wrong thing, if the consequences are going to fall on someone who doesn't de... | 0 | 47,811,230 | 0 | [] | 0 | [] | 0 | [
"all",
"and",
"are",
"as",
"car",
"consequences",
"defined",
"deserve",
"do",
"doesn",
"failure",
"fall",
"flash",
"fluffy",
"going",
"if",
"is",
"it",
"kittens",
"left",
"little",
"machines",
"might",
"mind",
"more",
"mysteriously",
"of",
"on",
"outputs",
"... | ||
47,811,963 | 0 | 2 | Peritract | 2026-04-18T02:11:49 | The number of solutions remains constant, because the OP isn't providing a working solution. | 0 | 47,810,960 | 0 | [] | 0 | [] | 0 | [
"because",
"constant",
"isn",
"number",
"of",
"op",
"providing",
"remains",
"solution",
"solutions",
"the",
"working",
"x27"
] | ||
47,811,964 | 0 | 2 | wakawaka28 | 2026-04-18T02:11:56 | They do wear out... | 0 | 47,811,810 | 0 | [] | 0 | [] | 0 | [
"do",
"out",
"they",
"wear"
] | ||
47,811,965 | 0 | 2 | xvxvx | 2026-04-18T02:11:58 | Literally what is the point. Generational guilt? $$$? | 0 | 47,811,750 | 0 | [] | 0 | [] | 0 | [
"generational",
"guilt",
"is",
"literally",
"point",
"the",
"what"
] | ||
47,811,967 | 0 | 2 | alexjurkiewicz | 2026-04-18T02:12:21 | And it's unnecessarily rude. Grammarly and Hemingway can identify the same sort of issues without "you are a stupid robot" vibes. | 0 | 47,811,809 | 0 | [] | 0 | [] | 0 | [
"and",
"are",
"can",
"grammarly",
"hemingway",
"identify",
"issues",
"it",
"of",
"quot",
"robot",
"rude",
"same",
"sort",
"stupid",
"the",
"unnecessarily",
"vibes",
"without",
"x27",
"you"
] | ||
47,811,968 | 0 | 2 | worik | 2026-04-18T02:12:32 | States, generally, have to combat corruption.<p>This is an example. The corrupting influence of "Big money" up against transparency<p>Transparency helps, especially in Europe where civil society runs deep.<p>My mind is blown by the USAnian president blatantly grafting, out in the open, and it is not a politi... | 0 | 47,805,390 | 0 | [] | 0 | [] | 0 | [
"against",
"an",
"analysts",
"and",
"big",
"blatantly",
"blown",
"by",
"civil",
"combat",
"corrupting",
"corruption",
"cost",
"deep",
"election",
"especially",
"europe",
"example",
"generally",
"grafting",
"have",
"helps",
"hungarian",
"in",
"influence",
"is",
"it... | ||
47,811,969 | 0 | 2 | WarmWash | 2026-04-18T02:12:40 | The alternative was a teleco AOL style internet with pay tiers for access to select websites. The free web of the 90's would remain, but would be about as culturally relevant as Linux.<p>Surely you have to recognize the inconsistency of saying that Google "corporatized" the web, while the vast majority o... | 0 | 47,810,089 | 0 | [] | 0 | [] | 0 | [
"90",
"about",
"access",
"account",
"ads",
"all",
"alternative",
"am",
"and",
"anyone",
"anything",
"aol",
"are",
"as",
"associate",
"at",
"balance",
"be",
"besides",
"big",
"but",
"buying",
"card",
"come",
"companies",
"computer",
"conspired",
"corporatized",
... | ||
47,811,970 | 0 | 2 | mikey_l3verage | 2026-04-18T02:12:40 | Lmao. Well played. Typical Cloudflare. That said - I think you can ostensibly run it on any backend API. Idk what a "dynamic" worker is. But I know Astro handles routing and all the 'workers" really have to do is connected backend to DB. | 0 | 47,681,270 | 0 | [] | 0 | [] | 0 | [
"all",
"and",
"any",
"api",
"astro",
"backend",
"but",
"can",
"cloudflare",
"connected",
"db",
"do",
"dynamic",
"handles",
"have",
"idk",
"is",
"it",
"know",
"lmao",
"on",
"ostensibly",
"played",
"quot",
"really",
"routing",
"run",
"said",
"that",
"the",
"... | ||
47,811,971 | 0 | 2 | ClimaxGravely | 2026-04-18T02:12:45 | Probably why Ronaldo and Salvatore Gannaci were DLC in the new KOF game. There was a pretty audible collective groan from the fighting game community when that was announced. | 0 | 47,809,201 | 0 | [] | 0 | [] | 0 | [
"and",
"announced",
"audible",
"collective",
"community",
"dlc",
"fighting",
"from",
"game",
"gannaci",
"groan",
"in",
"kof",
"new",
"pretty",
"probably",
"ronaldo",
"salvatore",
"that",
"the",
"there",
"was",
"were",
"when",
"why"
] | ||
47,811,972 | 0 | 2 | elictronic | 2026-04-18T02:13:07 | Congress sets the budget not the president. The administrations budget is aspirational, and if they want to force it they are required to use political savvy and whatever influence they have built up. Yeah so zero influence as all of that is towards cover ups, stock manipulation, and incompetence. | 0 | 47,810,573 | 0 | [] | 0 | [] | 0 | [
"administrations",
"all",
"and",
"are",
"as",
"aspirational",
"budget",
"built",
"congress",
"cover",
"force",
"have",
"if",
"incompetence",
"influence",
"is",
"it",
"manipulation",
"not",
"of",
"political",
"president",
"required",
"savvy",
"sets",
"so",
"stock",... | ||
47,811,973 | 0 | 2 | wr2 | 2026-04-18T02:13:15 | Also railways would always be useful at that time - e.g. logistics in warfare.<p>What other uses do GPU's have that are critical...? lol<p>In addition to your points, this is why I always laugh when people do backward comparisons. What characteristics do they share in common? Very little. | 0 | 47,811,382 | 0 | [] | 0 | [] | 0 | [
"addition",
"also",
"always",
"are",
"at",
"backward",
"be",
"characteristics",
"common",
"comparisons",
"critical",
"do",
"gpu",
"have",
"in",
"is",
"laugh",
"little",
"logistics",
"lol",
"other",
"people",
"points",
"railways",
"share",
"that",
"they",
"this",... | ||
47,811,974 | 0 | 2 | WalterBright | 2026-04-18T02:13:38 | Back in the PDP-10 days, one communicated with it using a terminal attached to it. One of my fellow students discovered that if you hit backspace enough times, the terminal handler would keep erasing characters before the buffer. Go far enough, and then there was an escape character (Ctrl-u?) that would delete the whol... | 0 | 47,809,190 | 0 | [] | 0 | [] | 0 | [
"10",
"an",
"and",
"attached",
"back",
"backspace",
"before",
"buffer",
"character",
"characters",
"communicated",
"ctrl",
"days",
"delete",
"discovered",
"enough",
"erasing",
"escape",
"far",
"fellow",
"go",
"handler",
"hit",
"if",
"in",
"it",
"keep",
"line",
... | ||
Hacker News - Complete Archive
Every Hacker News item since 2006, live-updated every 5 minutes
What is it?
This dataset contains the complete Hacker News archive: every story, comment, Ask HN, Show HN, job posting, and poll ever submitted to the site. Hacker News is one of the longest-running and most influential technology communities on the internet, operated by Y Combinator since 2007. It has become the de facto gathering place for founders, engineers, researchers, and technologists to share and discuss what matters in technology.
The archive currently spans from 2006-10 to 2026-04-18 20:15 UTC, with 47,747,219 items committed. New items are fetched every 5 minutes and committed directly as individual Parquet files through an automated live pipeline, so the dataset stays current with the site itself.
We believe this is one of the most complete and regularly updated mirrors of Hacker News data available on Hugging Face. The data is stored as monthly Parquet files sorted by item ID, making it straightforward to query with DuckDB, load with the datasets library, or process with any tool that reads Parquet.
What is being released?
The dataset is organized as one Parquet file per calendar month, plus 5-minute live files for today's activity. Every 5 minutes, new items are fetched from the source and committed directly as a single Parquet block. At midnight UTC, the entire current month is refetched from the source as a single authoritative Parquet file, and today's individual 5-minute blocks are removed from the today/ directory.
data/
2006/2006-10.parquet first month with HN data
2006/2006-12.parquet
2007/2007-01.parquet
...
  2026/2026-03.parquet most recent complete month
  2026/2026-04.parquet current month, ongoing through 2026-04-17
today/
2026/04/18/00/00.parquet 5-min live blocks (YYYY/MM/DD/HH/MM.parquet)
2026/04/18/00/05.parquet
...
2026/04/18/20/15.parquet most recent committed block
stats.csv one row per committed month
stats_today.csv one row per committed 5-min block
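The today/YYYY/MM/DD/HH/MM.parquet naming above is deterministic, so a small helper can map any UTC item timestamp to its live block. This is an illustrative sketch; the `block_path` name is my own, not part of the pipeline:

```python
from datetime import datetime

def block_path(ts: datetime) -> str:
    """Map a UTC timestamp to its 5-minute live block path
    (today/YYYY/MM/DD/HH/MM.parquet), rounding down to the window start."""
    minute = ts.minute - ts.minute % 5
    return (f"today/{ts.year}/{ts.month:02d}/{ts.day:02d}/"
            f"{ts.hour:02d}/{minute:02d}.parquet")
```

For example, an item posted at 2026-04-18 20:17 UTC lands in the 20:15 block listed above.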
Along with the Parquet files, we include stats.csv which tracks every committed month with its item count, ID range, file size, fetch duration, and commit timestamp. This makes it easy to verify completeness and track the pipeline's progress.
Breakdown by hour today
The chart below shows items committed to this dataset by hour today (2026-04-18, 5,976 items across 21 hours, last updated 2026-04-18 20:20 UTC).
00:00 ██████████████████░░░░░░░░░░░░ 298
01:00 ██████████████░░░░░░░░░░░░░░░░ 234
02:00 ████████████░░░░░░░░░░░░░░░░░░ 200
03:00 ████████████░░░░░░░░░░░░░░░░░░ 192
04:00 ████████████░░░░░░░░░░░░░░░░░░ 193
05:00 ████████████░░░░░░░░░░░░░░░░░░ 197
06:00 ████████████░░░░░░░░░░░░░░░░░░ 202
07:00 ████████████████░░░░░░░░░░░░░░ 260
08:00 █████████████░░░░░░░░░░░░░░░░░ 220
09:00 ████████████░░░░░░░░░░░░░░░░░░ 196
10:00 ███████████████░░░░░░░░░░░░░░░ 239
11:00 ███████████████░░░░░░░░░░░░░░░ 245
12:00 ██████████████████░░░░░░░░░░░░ 288
13:00 █████████████████████░░░░░░░░░ 342
14:00 ██████████████████████░░░░░░░░ 357
15:00 █████████████████████████░░░░░ 411
16:00 ████████████████████████████░░ 452
17:00 ████████████████████████████░░ 453
18:00 ████████████████████████░░░░░░ 389
19:00 ██████████████████████████████ 477
20:00 ████████░░░░░░░░░░░░░░░░░░░░░░ 131
Breakdown by year
The chart below shows items committed to this dataset by year.
2006 █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 62
2007 █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 93.8K
2008 ██░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 320.9K
2009 ███░░░░░░░░░░░░░░░░░░░░░░░░░░░ 608.4K
2010 ██████░░░░░░░░░░░░░░░░░░░░░░░░ 1.0M
2011 ████████░░░░░░░░░░░░░░░░░░░░░░ 1.4M
2012 ██████████░░░░░░░░░░░░░░░░░░░░ 1.6M
2013 █████████████░░░░░░░░░░░░░░░░░ 2.0M
2014 ███████████░░░░░░░░░░░░░░░░░░░ 1.8M
2015 █████████████░░░░░░░░░░░░░░░░░ 2.0M
2016 ████████████████░░░░░░░░░░░░░░ 2.5M
2017 █████████████████░░░░░░░░░░░░░ 2.7M
2018 ██████████████████░░░░░░░░░░░░ 2.8M
2019 ████████████████████░░░░░░░░░░ 3.1M
2020 ████████████████████████░░░░░░ 3.7M
2021 ███████████████████████████░░░ 4.2M
2022 █████████████████████████████░ 4.4M
2023 ██████████████████████████████ 4.6M
2024 ████████████████████████░░░░░░ 3.7M
2025 █████████████████████████░░░░░ 3.9M
2026 ████████░░░░░░░░░░░░░░░░░░░░░░ 1.4M
How to download and use this dataset
You can load the full dataset, a specific year, or even a single month. The dataset uses the standard Hugging Face Parquet layout, so it works out of the box with DuckDB, the datasets library, pandas, and huggingface_hub.
Using DuckDB
DuckDB can read Parquet files directly from Hugging Face without downloading anything first. This is the fastest way to explore the data:
The type column is stored as a small integer: 1 = story, 2 = comment, 3 = poll, 4 = pollopt, 5 = job. The "by" column (author username) must be quoted in DuckDB because by is a reserved keyword.
-- Top 20 highest-scored stories of all time
SELECT id, title, "by", score, url, time
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND title != ''
ORDER BY score DESC
LIMIT 20;
-- Monthly submission volume for a specific year
SELECT
strftime(time, '%Y-%m') AS month,
count(*) AS items,
count(*) FILTER (WHERE type = 1) AS stories,
count(*) FILTER (WHERE type = 2) AS comments
FROM read_parquet('hf://datasets/open-index/hacker-news/data/2024/*.parquet')
GROUP BY month
ORDER BY month;
-- Most discussed stories by total comment count
SELECT id, title, "by", score, descendants AS comments, url
FROM read_parquet('hf://datasets/open-index/hacker-news/data/2025/*.parquet')
WHERE type = 1 AND descendants > 0
ORDER BY descendants DESC
LIMIT 20;
-- Who posts the most Ask HN questions?
SELECT "by", count(*) AS posts
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND title LIKE 'Ask HN:%'
GROUP BY "by"
ORDER BY posts DESC
LIMIT 20;
-- Track how often a topic appears on HN over time
SELECT
extract(year FROM time) AS year,
count(*) AS mentions
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND lower(title) LIKE '%rust%'
GROUP BY year
ORDER BY year;
-- Top linked domains, year over year
SELECT
extract(year FROM time) AS year,
regexp_extract(url, 'https?://([^/]+)', 1) AS domain,
count(*) AS stories
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1 AND url != ''
GROUP BY year, domain
QUALIFY row_number() OVER (PARTITION BY year ORDER BY stories DESC) <= 5
ORDER BY year, stories DESC;
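If you prefer post-processing in Python, the host extraction from the last query can be mirrored with the same regular expression. A minimal sketch; the `domain` helper is hypothetical, not part of any API:

```python
import re

_DOMAIN = re.compile(r"https?://([^/]+)")

def domain(url: str):
    """Extract the host from a story URL, mirroring the SQL
    regexp_extract(url, 'https?://([^/]+)', 1) in the query above."""
    m = _DOMAIN.match(url)
    return m.group(1) if m else None
```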
Using datasets
from datasets import load_dataset
# Stream the full history without downloading everything first
ds = load_dataset("open-index/hacker-news", split="train", streaming=True)
for item in ds:
print(item["id"], item["type"], item["title"])
# Load a specific year into memory
ds = load_dataset(
"open-index/hacker-news",
data_files="data/2024/*.parquet",
split="train",
)
print(f"{len(ds):,} items in 2024")
# Load today's live blocks (updated every 5 minutes)
ds = load_dataset(
"open-index/hacker-news",
name="today",
split="train",
streaming=True,
)
Using huggingface_hub
from huggingface_hub import snapshot_download
# Download only 2024 data (about 1.5 GB)
snapshot_download(
"open-index/hacker-news",
repo_type="dataset",
local_dir="./hn/",
allow_patterns="data/2024/*",
)
For faster downloads, install the transfer extra (pip install "huggingface_hub[hf_transfer]") and set HF_HUB_ENABLE_HF_TRANSFER=1.
Using the CLI
# Download a single month
huggingface-cli download open-index/hacker-news \
data/2024/2024-01.parquet \
--repo-type dataset --local-dir ./hn/
Using pandas + DuckDB
import duckdb
conn = duckdb.connect()
# Score distribution: what does a "typical" HN story look like?
# type=1 is story (stored as integer: 1=story, 2=comment, 3=poll, 4=pollopt, 5=job)
df = conn.sql("""
SELECT
percentile_disc(0.50) WITHIN GROUP (ORDER BY score) AS p50,
percentile_disc(0.90) WITHIN GROUP (ORDER BY score) AS p90,
percentile_disc(0.99) WITHIN GROUP (ORDER BY score) AS p99,
percentile_disc(0.999) WITHIN GROUP (ORDER BY score) AS p999
FROM read_parquet('hf://datasets/open-index/hacker-news/data/*/*.parquet')
WHERE type = 1
""").df()
print(df)
Dataset statistics
You can query the per-month statistics directly from the stats.csv file included in the dataset:
SELECT * FROM read_csv_auto('hf://datasets/open-index/hacker-news/stats.csv')
ORDER BY year, month;
The stats.csv file tracks each committed month with the following columns:
| Column | Description |
|---|---|
| year, month | Calendar month |
| lowest_id, highest_id | Item ID range covered by this file |
| count | Number of items in the file |
| dur_fetch_s | Seconds to fetch from the data source |
| dur_commit_s | Seconds to commit to Hugging Face |
| size_bytes | Parquet file size on disk |
| committed_at | ISO 8601 timestamp of when this month was committed |
Content breakdown
Hacker News has five item types. The vast majority of content is comments, followed by stories (which include Ask HN, Show HN, and regular link submissions). Jobs, polls, and poll options make up a small fraction.
| Type | Count | Share |
|---|---|---|
| comment | 41,616,741 | 87.2% |
| story | 6,079,812 | 12.7% |
| job | 18,094 | 0.0% |
| poll | 2,242 | 0.0% |
| pollopt | 15,463 | 0.0% |
Of all stories submitted to Hacker News, 84.6% link to an external URL. The rest are text-only posts: Ask HN questions, Show HN launches, and other self-posts where the discussion itself is the content.
The average story generates 23.8 comments in its discussion thread. The most-discussed story of all time received 9,275 comments, which gives a sense of how deep conversations can go on particularly controversial or interesting topics.
Story scores
Scores on Hacker News follow a steep power law. Most stories receive only a few points, but a small number break out and reach the front page with hundreds or thousands of upvotes.
| Metric | Value |
|---|---|
| Average score | 1.5 |
| Median score | 0 |
| Highest score ever | 6,015 |
| Stories with 100+ points | 175,933 |
| Stories with 1,000+ points | 2,169 |
The median score of 0 reflects the fact that many stories are submitted but never gain traction. However, the long tail is where things get interesting: 6,079,812 stories have been submitted in total, and the roughly 0.04% with 1,000+ points represent the content that defined conversations across the technology industry.
Most-shared domains
The domains most frequently linked from Hacker News stories tell a clear story about what the community values. GitHub dominates, reflecting HN's deep roots in open source and software development. Major publications like the New York Times and Ars Technica show the community's interest in journalism and long-form analysis.
| # | Domain | Stories |
|---|---|---|
| 1 | github.com | 200,995 |
| 2 | www.youtube.com | 135,566 |
| 3 | medium.com | 124,707 |
| 4 | www.nytimes.com | 77,999 |
| 5 | en.wikipedia.org | 54,655 |
| 6 | techcrunch.com | 54,281 |
| 7 | twitter.com | 50,955 |
| 8 | arstechnica.com | 47,259 |
| 9 | www.theguardian.com | 44,557 |
| 10 | www.bloomberg.com | 37,959 |
Most active story submitters
These are the users who have submitted the most stories over the lifetime of Hacker News. Many of them have been active for over a decade, consistently curating and sharing content with the community.
| # | User | Stories |
|---|---|---|
| 1 | rbanffy | 36,901 |
| 2 | Tomte | 26,259 |
| 3 | tosh | 24,273 |
| 4 | bookofjoe | 20,785 |
| 5 | mooreds | 20,641 |
| 6 | pseudolus | 19,974 |
| 7 | PaulHoule | 19,207 |
| 8 | todsacerdoti | 18,887 |
| 9 | ingve | 17,117 |
| 10 | thunderbong | 16,130 |
| 11 | jonbaer | 14,195 |
| 12 | rntn | 13,410 |
| 13 | doener | 12,971 |
| 14 | Brajeshwar | 12,738 |
| 15 | LinuxBender | 11,058 |
How it works
The pipeline is built in Go and uses DuckDB for Parquet conversion. Historical data is sourced from ClickHouse; live data is fetched directly from the HN Firebase API.
Historical backfill. The pipeline iterates through every month from October 2006 to the most recent complete month. For each month, it queries the ClickHouse source with a time-bounded SQL query, exports the result as a Parquet file sorted by id using DuckDB with Zstandard compression at level 22, and commits it to this repository along with an updated stats.csv and README.md. Months already tracked in stats.csv are skipped, making the process fully resumable.
Live polling. Every 5 minutes, the pipeline calls the HN Firebase API to fetch new items by ID range. Items are grouped into their 5-minute time windows, written as individual Parquet files at today/YYYY/MM/DD/HH/MM.parquet using DuckDB, and committed to Hugging Face immediately. Using the HN API directly means live blocks reflect real-time data with no indexing lag.
Day rollover. At midnight UTC, the entire current month is refetched from the ClickHouse source in a single query and written as an authoritative Parquet file. Today's individual 5-minute blocks are deleted from the repository in the same atomic commit. Refetching instead of merging ensures the monthly file is always complete and deduplicated, regardless of any local state.
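As a sketch of the bookkeeping this implies, the set of live blocks to delete at rollover is fully determined by the date: 24 hours × 12 windows = 288 possible paths, plus one authoritative monthly file. The helper below is illustrative only; its names are mine, not the pipeline's:

```python
from datetime import date

def rollover_paths(day: date):
    """Return the authoritative monthly file for `day`, plus every possible
    5-minute block path removed for that day at midnight UTC."""
    monthly = f"data/{day.year}/{day.year}-{day.month:02d}.parquet"
    blocks = [
        f"today/{day.year}/{day.month:02d}/{day.day:02d}/{h:02d}/{m:02d}.parquet"
        for h in range(24)
        for m in range(0, 60, 5)
    ]
    return monthly, blocks
```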
Thanks
The data in this dataset comes from the ClickHouse Playground, a free public SQL endpoint maintained by ClickHouse, Inc. that mirrors the official Hacker News Firebase API. ClickHouse uses Hacker News as one of their canonical demo datasets. Without their public endpoint, building and maintaining a complete, regularly updated archive like this would not be practical.
The original content is created by the Hacker News community and is operated by Y Combinator. This is an independent mirror and is not affiliated with or endorsed by Y Combinator or ClickHouse, Inc.
Dataset card for Hacker News - Complete Archive
Dataset summary
This dataset is a complete mirror of the Hacker News archive, sourced from the ClickHouse Playground which itself mirrors the official HN Firebase API. The data covers every item ever posted to the site, from the earliest submissions in October 2006 through today.
The dataset is intended for research, analysis, and training. Common use cases include:
- Language model pretraining and fine-tuning on high-quality technical discussions
- Sentiment and trend analysis across two decades of technology discourse
- Community dynamics research on one of the internet's most influential forums
- Information retrieval benchmarks using real-world questions and answers
- Content recommendation and ranking model development
Dataset structure
Data instances
Here is an example item from the dataset. This is a story submission with a link to an external URL:
{
"id": 1,
"deleted": 0,
"type": 1,
"by": "pg",
"time": "2006-10-09T18:21:51+00:00",
"text": "",
"dead": 0,
"parent": 0,
"poll": 0,
"kids": [15, 234509, 487171],
"url": "http://ycombinator.com",
"score": 57,
"title": "Y Combinator",
"parts": [],
"descendants": 0,
"words": ["y", "combinator"]
}
And here is a comment, showing how discussion threads are connected via the parent field:
{
"id": 15,
"deleted": 0,
"type": 2,
"by": "sama",
"time": "2006-10-09T19:51:01+00:00",
"text": "\"the way to get good software is to find ...",
"dead": 0,
"parent": 1,
"poll": 0,
"kids": [17],
"url": "",
"score": 0,
"title": "",
"parts": [],
"descendants": 0,
"words": []
}
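Reconstructing a discussion tree is a matter of grouping item IDs under their parent. A minimal sketch over the two sample items above; the `build_threads` helper is illustrative, not part of the dataset tooling:

```python
from collections import defaultdict

def build_threads(items):
    """Group item IDs under their parent so a discussion tree can be
    walked top-down; parent == 0 marks a top-level item (a story)."""
    children = defaultdict(list)
    for it in items:
        if it["parent"]:
            children[it["parent"]].append(it["id"])
    return dict(children)

# The two sample items above, trimmed to the fields that matter here:
items = [
    {"id": 1, "type": 1, "parent": 0},   # the "Y Combinator" story
    {"id": 15, "type": 2, "parent": 1},  # sama's comment on it
]
children = build_threads(items)  # {1: [15]}
```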
Data fields
Every Parquet file shares the same schema, matching the HN API item format:
| Column | Type | Description |
|---|---|---|
| id | uint32 | Unique item ID, monotonically increasing across the entire site |
| deleted | uint8 | 1 if the item was soft-deleted by its author or by moderators, 0 otherwise |
| type | int8 | Item type as an integer: 1=story, 2=comment, 3=poll, 4=pollopt, 5=job |
| by | string | Username of the author who created this item. Note: by is a reserved word in DuckDB and must be quoted as "by" |
| time | timestamp | When the item was created, in UTC |
| text | string | HTML body text. Used for comments, Ask HN posts, job listings, and polls |
| dead | uint8 | 1 if the item was flagged or killed by moderators, 0 otherwise |
| parent | uint32 | The ID of the parent item. For comments, this points to either a story or another comment |
| poll | uint32 | For poll options (pollopt), the ID of the associated poll |
| kids | list<uint32> | Ordered list of direct child item IDs (typically comments) |
| url | string | The external URL for link stories. Empty for text posts and comments |
| score | int32 | The item's score (upvotes minus downvotes) |
| title | string | Title text for stories, jobs, and polls. Empty for comments |
| parts | list<uint32> | For polls, the list of associated poll option item IDs |
| descendants | int32 | Total number of comments in the entire discussion tree below this item |
| words | list<string> | Tokenized words extracted from the title and text fields |
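Before heavy processing, the documented columns can be spot-checked in code. The SCHEMA mapping below simply restates the table above; `missing_columns` is a made-up helper for illustration:

```python
# Column -> logical type, restating the schema table above
SCHEMA = {
    "id": "uint32", "deleted": "uint8", "type": "int8", "by": "string",
    "time": "timestamp", "text": "string", "dead": "uint8",
    "parent": "uint32", "poll": "uint32", "kids": "list<uint32>",
    "url": "string", "score": "int32", "title": "string",
    "parts": "list<uint32>", "descendants": "int32", "words": "list<string>",
}

def missing_columns(columns):
    """Return any documented columns absent from a file's column list."""
    return set(SCHEMA) - set(columns)
```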
Data splits
The default configuration includes all historical monthly Parquet files. If you only need today's latest items, use the `today` configuration, which includes only the 5-minute live blocks for the current day.
You can also load individual years or months by specifying `data_files`:
```python
from datasets import load_dataset

# Load just January 2024
ds = load_dataset("open-index/hacker-news", data_files="data/2024/2024-01.parquet", split="train")

# Load all of 2024
ds = load_dataset("open-index/hacker-news", data_files="data/2024/*.parquet", split="train")
```
Dataset creation
Curation rationale
Hacker News is one of the richest sources of technical discussion on the internet, but accessing the full archive programmatically has historically required either scraping the Firebase API item-by-item or working with incomplete third-party dumps. This dataset provides the complete archive in a standard, efficient format that anyone can query without setting up infrastructure.
By publishing on Hugging Face with Parquet files, the data becomes immediately queryable with DuckDB (via hf:// paths), streamable with the datasets library, and downloadable in bulk. The 5-minute live update pipeline means researchers always have access to near-real-time data.
Source data
All data is sourced from the ClickHouse Playground, a public SQL endpoint maintained by ClickHouse that mirrors the official Hacker News Firebase API. The ClickHouse mirror is widely used for analytics demonstrations and contains the complete dataset.
The pipeline queries the ClickHouse endpoint month-by-month, exports each month as a Parquet file using DuckDB with Zstandard compression at level 22, and commits it to this Hugging Face repository. Already-committed months are tracked in stats.csv and skipped on subsequent runs, making the process fully resumable.
Data processing steps
The pipeline runs in three modes:
1. **Historical backfill.** Iterates through every month from October 2006 to the most recent complete month. For each month, it runs a SQL query against the ClickHouse source, writes the result as a Parquet file sorted by `id`, and commits it to Hugging Face along with an updated `stats.csv` and `README.md`.
2. **Live polling.** After the historical backfill completes, the pipeline polls the HN Firebase API every 5 minutes for new items. It fetches all items with IDs greater than the last committed watermark, groups them into 5-minute time windows by item timestamp, and writes each window as a `today/YYYY/MM/DD/HH/MM.parquet` file committed to Hugging Face immediately. The HN API provides real-time data with no indexing lag.
3. **Day rollover.** At midnight UTC, the entire current month is refetched from the ClickHouse source in a single query and written as a fresh, authoritative Parquet file. Today's individual 5-minute blocks are deleted from the repository in the same atomic commit. This approach is more reliable than merging local blocks: the result is always complete and deduplicated, sourced directly from the origin.
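The 5-minute windowing used by the live poller can be sketched as a pure function from an item timestamp to its block path. The `window_path` helper below is hypothetical, illustrating only the `today/YYYY/MM/DD/HH/MM.parquet` layout described above, not the actual pipeline code:

```python
from datetime import datetime, timezone

def window_path(ts: datetime) -> str:
    """Map an item's UTC timestamp to its 5-minute live block path."""
    minute = ts.minute - ts.minute % 5  # floor to the 5-minute boundary
    return f"today/{ts:%Y/%m/%d/%H}/{minute:02d}.parquet"

ts = datetime(2026, 4, 18, 2, 7, 13, tzinfo=timezone.utc)
path = window_path(ts)  # "today/2026/04/18/02/05.parquet"
```

Grouping fetched items by this key means every item lands in exactly one block, which is what makes the midnight rollover's wholesale replacement of the day's blocks safe.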
All Parquet files use Zstandard compression at level 22 and are sorted by id for efficient range scans. No filtering, deduplication, or transformation is applied to the data beyond what the source provides.
Personal and sensitive information
This dataset contains usernames (by field) and user-generated text content (text, title fields) as they appear on the public Hacker News website. No additional PII processing has been applied. The data reflects what is publicly visible on news.ycombinator.com.
If you find content in this dataset that you believe should be removed, please open a discussion on the Community tab.
Considerations for using the data
Social impact
By providing the complete Hacker News archive in an accessible format, we hope to enable research into online community dynamics, technology trends, and the evolution of technical discourse. The dataset can serve as training data for language models that need to understand technical discussions, or as a benchmark for information retrieval and recommendation systems.
Discussion of biases
Hacker News has a well-documented set of community biases. The user base skews heavily toward software engineers, startup founders, and technology enthusiasts based in the United States. Topics related to Silicon Valley, programming languages, startups, and certain political viewpoints tend to receive disproportionate attention and engagement.
The moderation system (flagging, vouching, and moderator intervention) shapes what content survives and what gets killed. Stories and comments that violate community norms are flagged as dead, but this moderation reflects the values of the existing community rather than any objective standard.
We have not applied any additional filtering or quality scoring to the data. All items, including deleted and dead items, are preserved exactly as they appear in the source.
Known limitations
- **`type` is an integer.** The item type is stored as a TINYINT enum: 1=story, 2=comment, 3=poll, 4=pollopt, 5=job. When writing DuckDB queries, use `WHERE type = 1` for stories rather than `WHERE type = 'story'`.
- **`by` is a reserved keyword in DuckDB.** Always quote it with double quotes: `"by"`.
- **`deleted` and `dead` are integers.** They are stored as 0/1 rather than booleans.
- **Comment text is HTML.** The `text` field contains raw HTML as stored by HN, not plain text. You may need to strip tags depending on your use case.
- **Deleted items have sparse fields.** When an item is deleted, most fields become empty, but the `id` and `deleted` flag are preserved.
- **Scores are point-in-time snapshots.** The score reflects the value at the time the ClickHouse mirror last synced, not necessarily the final score.
- **No user profiles.** This dataset contains items only, not user profiles (karma, bio, etc.).
- **Code content is HTML-escaped.** Code snippets in comments use HTML entities and `<code>` tags rather than Markdown formatting.
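Two of these gotchas, the integer type codes and the HTML-encoded text, come up in almost any downstream use. A minimal stdlib-only sketch (the regex-based tag stripping is a rough approximation; a real HTML parser is safer for arbitrary markup, and the sample item is hypothetical):

```python
import html
import re

# Integer type codes as documented in the schema above
TYPE_NAMES = {1: "story", 2: "comment", 3: "poll", 4: "pollopt", 5: "job"}

def to_plain_text(html_text: str) -> str:
    """Reduce HN's HTML body text to rough plain text."""
    text = re.sub(r"<p>", "\n\n", html_text)  # paragraph tags become blank lines
    text = re.sub(r"<[^>]+>", "", text)       # drop remaining tags such as <code>
    return html.unescape(text).strip()        # decode entities like &#x27; and &quot;

item = {"type": 2, "text": "It&#x27;s <code>printf(&quot;hi&quot;)</code><p>Second paragraph."}
kind = TYPE_NAMES[item["type"]]    # "comment"
plain = to_plain_text(item["text"])
```

The same mapping applies on the SQL side: filter with `WHERE type = 1`, not `WHERE type = 'story'`.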
Additional information
Licensing
The dataset is released under the Open Data Commons Attribution License (ODC-By) v1.0. The original content is subject to the rights of its respective authors. Hacker News data is provided by Y Combinator.
This is an independent community mirror. It is not affiliated with or endorsed by Y Combinator.
Contact
For questions, feedback, or issues, please open a discussion on the Community tab.
Last updated: 2026-04-18 20:20 UTC