One of my favorite quotes: “There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies.”

I think about this a lot because it’s true of any complex system or argument, not just software.

This is indeed a great quote (one of many gems from Sir Tony) but I think the context that follows it is also an essential insight:

> The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.

(All from his Turing Award lecture, "The Emperor's Old Clothes": https://www.labouseur.com/projects/codeReckon/papers/The-Emp...)

"At first I hoped that such a technically unsound project would collapse but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way-- and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay."

This explains quite a lot, actually!

Very poignant, thank you. I can see my absolute core principle - KISS - reflected in this. I still struggle to find a single case in my career where it wouldn't have been the best approach, especially long term.
From the linked lecture, which I printed out to read as part of a new less-is-more screen-time regime (where I print out longer-form writing for reading), I found this very interesting tidbit in the context of Tony having made a delivery miscalculation and his team failing to deliver on one of their products, which is where I think a lot of people are today with LLMs:

"Each of my managers explained carefully his own theory of what had gone wrong and all the theories were different. At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me.

"You know what went wrong?" he shouted--he always shouted -- "You let your programmers do things which you yourself do not understand." I stared in astonishment. "

"No committee will ever do this until it is too late."

The software I like best was not written by "teams"

I prefer small programs written by individuals that generally violate memes like "software is never finished" and "all software has bugs"

(End user perspective, not a developer)

One of my biggest accomplishments was shipping a suite of 5 apps from four divisions where three of them resented each other’s existence and seemed bound and determined to build rules in the system that made sure the other two couldn’t function. Which made no goddamn sense because it was a pipeline and you can’t get anything out one end if it gets jammed in the middle.

I was brought in to finish building the interchange format. The previous guy was not up to snuff. The architect I worked for was (with love) a sarcastic bastard who eventually abdicated about 2 rings of the circus to me. He basically took some of the high level meetings and tapped in when one of us thought I might strangle someone.

Their initial impression was that I was a prize to be fought over like a child in a divorce. But the guy who gives you your data has you by the balls, if he is smart enough to realize it, so it went my way nine times out of ten. It was a lot of work threading that needle (I’ve never changed the semantics of a library so hard without changing the syntax), but it worked out for everyone. By the time we were done, the gap between the way things worked and the way they each wanted them to work was on the order of twenty lines of code on their end, which I essentially spoonfed them, so they didn’t have a lot of standing to complain. And our three teams always delivered within 15% of estimates, which was about half of anyone else’s error bar, so we slowly accreted responsibilities.

I ended up as principal on that project (during a hiring/promotional freeze on that title; someone pulled strings for that, and I felt bad for leaving within a year, but I stayed until I was sure the house wouldn’t burn down after I left, and I didn’t have to do that). I must have said “compromise means nobody gets their way” about twenty times in or between meetings.

Also, this software is free. Generally, the authors were not paid to write it.
It's the committee vs the dictator issue - a small driven individual (or group) can achieve a lot, but they can also turn into tyrants.

A committee forms when there's widespread disagreement on goals or priorities - representing stakeholders who can't agree. The cost is slower decisions and compromise solutions. The benefit is avoiding tyranny of a single vision that ignores real needs.

We are poorer for him having waited to drop that sentence at his Turing Award acceptance speech. I use it all the time.

Tony might be my favorite computer scientist.

It seems that with vibe coding our industry has finally, permanently embraced the latter approach. RIP Tony.
> permanently

don't bet on it

Can’t argue with the quote. However, my current boss has been pushing this to the extreme without much respect for real-world complexities (or perhaps I’m too obtuse to think of a simple solution for all our problems), which regrettably gives me a bit of pause when hearing this quote.
Reminds me of another good one: Make everything as simple as possible, but not simpler. (-- probably not Einstein)
aged very well
Reminds me of this Pascal quote: "I would have written a shorter letter, but I did not have the time."

https://www.npr.org/sections/13.7/2014/02/03/270680304/this-...

"Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away."

Antoine de Saint-Exupéry

Reminds me of this quote... “A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work.”
One of the policies of The Rhinoceros Party in Canada was to increase the complexity of the taxation system so much that nobody could find the loopholes to exploit.
Had to look them up (WP), wasn't disappointed. We have the Monster Raving Loony Party in the UK.

One of the Rhinoceros Party's policies stands out - are you sure Trump wasn't born a Canuck and was stolen at birth by raccoons and smuggled down south?

"Annexing the United States, which would take its place as the third territory in Canada's backyard (after the Yukon and the Northwest Territories—Nunavut did not yet exist), in order to eliminate foreign control of Canada's natural resources"

Good thing we now have technology that allows us to crank out complex software at rates never-before seen.
Complex software full of very obvious deficiencies that nobody bothered to look for.
It can also be used to simplify existing code bases.
As Dijkstra was preparing for his end of life, organizing his documents and correspondence became an important task. Cancer had snuck up on him and there was not much time.

One senior professor, who was helping out with this, asked Dijkstra what was to be done with his correspondence. The professor, quite renowned himself, relates that Dijkstra told him from his hospital bed to keep the letters with "Tony" and throw out the rest.

The professor adds, with dry wit, that his own correspondence with Dijkstra was in that pile too.

What is the equivalent of correspondence today?

I guess back then each letter had a cost, in (delivery) time and money, so you better make it count.

My guess is that these correspondences were often interesting to read because they had to be worthwhile to send, given the associated cost.

John Backus had some correspondence with Dijkstra that's worth a read: https://medium.com/@acidflask/this-guys-arrogance-takes-your...
Incredible letters, thanks for sharing. I wish some of this correspondence was published in physical books. What a joy it would be to read.
There's that immortal Alan Kay line "arrogance in computer science is measured in nano Dijkstras".
That's a wild ride of passive aggressive academia in a field I know something about. A rare treat. Thanks for sharing!
Fun story - at Oxford they like to name buildings after important people. Dr Hoare was nominated to have a house named after him. This presented the university with a dilemma of having a literal `Hoare house` (pronounced whore).

I can't remember what Oxford did to resolve this, but I think they settled on `C.A.R. Hoare Residence`.

There's the Tony Hoare Room [1] in the Robert Hooke Building. We held our Reinforcement Learning reading group there.

[1] https://www.cs.ox.ac.uk/people/jennifer.watson/tonyhoare.htm...

>our Reinforcement Learning reading group there //

Anyone else, like me, imagining ML models embodied as Androids attending what amounts to a book club? (I can't quite shake the image of them being little CodeBullets with CRT monitors for heads either.)

I had countless lectures and classes there
Our Graphics Lab at University used to be in an old house opposite a fish and chip shop. The people at the fish and chip shop were suspicious of our lab as all they saw was young men (mostly) entering and leaving at all hours of the night. We really missed an opportunity to name it "Hoare House" after one of our favourite computer scientists.
I was awarded the CAR Hoare prize from university, which is marginally better than the hoare prize I suppose
Shame the university takes itself so seriously. The illustrative example of overloading would have been pertinent to his subject of expertise.
Imagine being a world-famous computer scientist and dying and one of the top threads in a discussion of your life is juvenile crap about how your name sounds like "whore".
He was the professor in the Programming Research Group (known universally as the PRG) at Oxford when I was doing my DPhil and interviewed me for the DPhil. I spent quite a bit of time with him and, of course, spent a lot of time doing stuff with CSP including my entire DPhil.

Sad to think that the TonyHoare process has reached STOP.

RIP.

I think I and most people had hoped that he would DIV instead.
I lucked in to meeting him once, in Cambridge. A gentle intellectual giant.

I repeatedly borrow this quote from his 1980 Turing Award speech, 'The Emperor's Old Clothes'... "At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me. "You know what went wrong?" he shouted--he always shouted-- "You let your programmers do things which you yourself do not understand." I stared in astonishment. He was obviously out of touch with present day realities. How could one person ever understand the whole of a modern software product like the Elliott 503 Mark II software system? I realized later that he was absolutely right; he had diagnosed the true cause of the problem and he had planted the seed of its later solution."

My interpretation is that whether shifting from delegation to programmers, or to compilers, or to LLMs, the invariant is that we will always have to understand the consequences of our choices, or suffer the consequences.

> I realized later that he was absolutely right

It would have been fun if he’d directly said “You’re absolutely right!”

"Around Easter 1961, a course on ALGOL 60 was offered in Brighton, England, with Peter Naur, Edsger W. Dijkstra, and Peter Landin as tutors. I attended this course with my colleague in the language project, Jill Pym, our divisional Technical Manager, Roger Cook, and our Sales Manager, Paul King. It was there that I first learned about recursive procedures and saw how to program the sorting method which I had earlier found such difficulty in explaining. It was there that I wrote the procedure, immodestly named Quicksort, on which my career as a computer scientist is founded. Due credit must be paid to the genius of the designers of ALGOL 60 who included recursion in their language and enabled me to describe my invention so elegantly to the world. I have regarded it as the highest goal of programming language design to enable good ideas to be elegantly expressed." - C.A.R Hoare, The Emperor's Old Clothes, Comm. ACM 24(2), 75-83 (February 1981).
Tony's An Axiomatic Basis for Computer Programming[1] is the first academic paper I read, as an undergrad, that I was actually able to understand. I think it unlocked something in me, because before that I never believed that I would be able to read and understand scientific papers.

That was 35ish years ago. I just pulled up the paper now and I can't read the notation anymore... This might be something that I try applying an AI to. Get it to walk me through a paper paragraph-by-paragraph until I get back up to speed.

[1]:https://dl.acm.org/doi/10.1145/363235.363259

Follow up on the above with these two classics:

Retrospective: An Axiomatic Basis For Computer Programming. This was written 30 years after An Axiomatic Basis for Computer Programming to take stock of what was proven right and what was proven wrong - https://cacm.acm.org/opinion/retrospective-an-axiomatic-basi...

How Did Software Get So Reliable Without Proof? More detailed paper on the above theme (pdf) - https://6826.csail.mit.edu/2020/papers/noproof.pdf

Thanks for the recommendation. I downloaded both social.pdf and noproof.pdf on my Kindle Scribe to read them carefully and revisited the discussions on EWD638 and EWD692.

It is very interesting to see how Sir Tony diverged from EWD: one was right in a theoretical sense but cynical about human fallibility and about how society is heading towards ever more wasteful complexity; the other chose to live with it and stay optimistic.

There is a proverb in Chinese Taoism:

小隱隱於野,大隱隱於市

A small recluse hides in the wild, while a great recluse hides in the city

I can recommend NotebookLM [1] for reading through scientific papers. You can then ask it questions and even get a podcast generated.

1. https://notebooklm.google/

One of the greats. Invented quicksort and concurrent sequential processes. I always looked up to him because he also seemed very humble.
He also invented many other things, like enumeration types, optional types, constructors. He popularized the "unions" introduced by McCarthy, which were later implemented in ALGOL 68, from where a crippled form of them was added to the C language.

Several keywords used in many programming languages come from Hoare, who either coined them himself, or he took them from another source, but all later programming language designers took them from Hoare. For example "case", but here only the keyword comes from Hoare, because a better form of the "case" statement had been proposed first by McCarthy many years earlier, under the name "select".

Another example is "class" which Simula 67, then all object-oriented languages took from Hoare, However, in this case the keyword has not been used first by Hoare, because he took "class", together with "record", from COBOL.

Another keyword popularized by Hoare is "new" (which Hoare took from Wirth, but everybody else took from Hoare), later used by many languages, including C++. At Hoare, the counterpart of "new" was "destroy", hence the name "destructor", used first in C++.

The paper "Record Handling", published by C.A.R. Hoare in 1965-11 was a major influence on many programming languages. It determined significant changes in the IBM PL/I programming language, including the introduction of pointers . It also was the source of many features of the SIMULA 67 and ALGOL 68 languages, from where they spread in many later programming languages.

The programming language "Occam" has been designed mainly as an implementation of the ideas described by Hoare in the "Communicating Sequential Processes" paper published in 1978-08. OpenMP also inherits many of those concepts, and some of them are also in CUDA.

And, of course, the Go programming language.
And regretful inventor of the null reference!

His “billion dollar mistake”:

https://www.infoq.com/presentations/Null-References-The-Bill...

The mistake was not null references per se. The mistake was having all references be implicitly nullable.

He states around minute 25 the solution to the problem is to explicitly represent null in the type system, so nullable pointers are explicitly declared as such. But it can be complex to ensure that non-nullable references are always initialized to a non-null value, which is why he chose the easy solution to just let every reference be nullable.
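
To make that concrete, here is a minimal sketch in Go (an illustration of the idea, not anything from the talk): a generic Option type makes "may be absent" part of the declaration, so the caller has to handle the empty case explicitly.

    package main

    import "fmt"

    // Option[T] makes "may be absent" explicit in the type, instead of
    // letting every reference be implicitly nullable.
    type Option[T any] struct {
        value   T
        present bool
    }

    func Some[T any](v T) Option[T] { return Option[T]{value: v, present: true} }
    func None[T any]() Option[T]    { return Option[T]{} }

    // Get returns the value plus a flag; callers must check the flag
    // before using the value, so the "null case" cannot be forgotten.
    func (o Option[T]) Get() (T, bool) { return o.value, o.present }

    func main() {
        addr := None[string]()
        if v, ok := addr.Get(); ok {
            fmt.Println("address:", v)
        } else {
            fmt.Println("no address on file") // the empty case, handled explicitly
        }
    }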

The null reference was invented by Hoare as a means to implement optional types, which works regardless of their binary representation.

Optional types were a very valuable invention and the fact that null values have been handled incorrectly in many programming languages or environments is not Hoare's fault.

I'm pretty sure that this is not true. I talked to Bud Lawson (the inventor of the pointer) and he claimed that they had implemented special behaviour for null pointers earlier. When I talked to Tony later about it, he said he had never heard of Bud Lawson. So probably both invented them independently, but Bud came first.
Talking about Quicksort, Jon Bentley’s deep dive into Quicksort is quite illuminating. https://m.youtube.com/watch?v=QvgYAQzg1z8
oh man, google tech talks. what a throwback.

there was a time, 10-15 years ago, when they were super cool. at some point they """diluted""" the technical content and the nature of the guests, and they vanished into irrelevance.

Yes, but don't forget his formal work also (Hoare logic).
To me, this is his most important contribution; Everybody else built on top of this.

Hoare Logic - https://en.wikipedia.org/wiki/Hoare_logic

They were never concurrent, they were communicating. https://en.wikipedia.org/wiki/Communicating_sequential_proce...
That is indeed the correct title, but the processes were concurrent.

However, they were not just concurrent, but also communicating.

I remember attending a tech event at MSR Cambridge, and a speaker made some disparaging comment about older developers not being able to keep up in this modern world of programming.

An older gentleman stood up and politely mentioned they knew a thing or two.

That was Tony Hoare.

I've had the good fortune to attend two of his lectures in person. Each time, he effortlessly derived provably correct code from the conditions of the problem and made it seem all too easy. 10 minutes after leaving the lecture, my thought was "Wait, how did he do it again?".

RIP Sir Tony.

The confusion is possibly almost appropriate, given so much of his work was on creating systems which avoid confusion through using proper synchronized communication channels. The null pointer stuff is famous, but it's occam and the Communicating Sequential Processes work that were brilliant. Maybe it's also brilliantly wrong, as I think Actor model people could argue, but it is brilliant.

My favourite quote of his is “There are two ways of constructing a piece of software: One is to make it so simple that there are obviously no errors, and the other is to make it so complicated that there are no obvious errors.”

While we hope it's not true, if it is, a very deserved RIP.

CSP and Hoare logic were brilliant. He was a huge proponent of formal methods.

He famously gave up on making formal methods mainstream, but I believe there will be a comeback quite soon.

On generated code, verification is the bottleneck. He was right, just too early.

(btw the "confusion" here was confusion about whether he had actually died. this comment was originally posted to https://news.ycombinator.com/item?id=47316880, which was the thread we merged hither)
And here we are throwing all that brilliance away with Async abominations. Software can be so simple and elegant.
Actor model would also be brilliantly wrong: it doesn't compose smaller correct systems into larger correct systems.

(Software) Transactional Memory and other ideas inspired by databases have a much better shot at this.

Damn.

Tony Hoare was on my bucket list of people I wanted to meet before I or they die. My grad school advisor always talked of him extremely highly, and while I cannot seem to confirm it, I believe Hoare might have been his PhD advisor.

It's hard to overstate how important Hoare was. CSP and Hoare Logic and UTP are all basically entire fields in their own right. It makes me sad he's gone.

You can always check his entry on the Mathematics Genealogy Project: https://mathgenealogy.org/id.php?id=45760
When I met him unfortunately I didn't realize how important he was (1987). The place where I worked used formal methods to verify the design of an FPU, in collaboration with the PRG. iirc the project was a success. I never heard of formal methods being successfully used again until TLA+ a few years ago.
Tony advised me to make money with the software model checker I have been writing, in contrast to the typical practice of making these tools open source and free to use. Would have loved to learn more from him. He was a great teacher but also a great and sharp listener. Still remember the detour we made on the way to a bar in London, talking too much and too deep about refinement relations. RIP.
I was introduced to him pretty late in the game, in this interview where he and Joe Armstrong and Carl Hewitt talked concurrency! It was interesting hearing them discuss their different thoughts and approaches

https://www.youtube.com/watch?v=37wFVVVZlVU

How sad that we all missed when Carl Hewitt submitted that video to HN: https://news.ycombinator.com/item?id=19202209.

I had no idea that a discussion between the three of them even existed.

He came to give a lecture at UT Austin, where I did my undergrad. I had a chance to ask him a question: "what's the story behind inventing QuickSort?". He said something simple, like "first I thought of MergeSort, and then I thought of QuickSort" - as if it were just natural thought. He came across as a kind and humble person. Glad to have met one of the greats of the field!
Happy to meet you. I was there and I remember that question being asked. I think it was 2010.

If I remember correctly, he had two immediate ideas: his first was bubble sort, and the second turned out to be quicksort.

He was already very frail by then, yet his clarity of mind was undiminished. What came across in that talk, in addition to his technical material, was his humor and warmth.

That's right - it was bubble sort first. Absolutely - frail, yet sharp. I'm happy to hear several of us didn't forget this encounter with him.
I remember this vividly! I believe he said that he thought of _Bubble Sort_ first, but that it was too slow, so he came up with QuickSort next
Good to hear from you after a while, Gaurav (I think?!).
He discusses this and his sixpence wager here: https://youtu.be/pJgKYn0lcno

(Source: TFA)

Haha I was there too. I remember he made thinking clearly seem so simple. What a humble man.

If I remember correctly, his talk was about how the world of science (the pure pursuit of truth) and the world of engineering (the practical application of solutions under constraints) had to learn from each other.

From his Oxford bio: "To assist in efficient look-up of words in a dictionary, he discovered the well-known sorting algorithm Quicksort."

I always liked this presentation. I think it's equally fine to say he "invented" something, but I think this framing fits his ethos (from what I understand of him): there are natural phenomena, and it just takes noticing them.

A near neighbour described being interviewed by Tony Hoare for his first job after graduating (he got the job!). Sounds like the interview process in those days was a chat over lunch rather than coding exercises. https://news.ycombinator.com/item?id=43592201
Sad that his (and many others') dream of widespread formal verification of software never came true. He made really fundamental contributions to computer science but will probably be mostly known for quicksort and the quote about his "billion dollar mistake", not his decades-long program to make formal methods more tractable.

Makes me think of an anecdote where Dijkstra said that he feared he would only be remembered for his shortest path algorithm.

Almost all of the earliest cited works on concurrency management in software were authored by C A R 'Tony' Hoare.

I genuinely forget he authored quicksort on the regular.

Actually, thanks to AI, this may change soon! We may be in a place where widespread formal verification is finally possible.
Random anecdote: Mr. Hoare (yep, not a Dr.) has always been one of my computing heroes.

Mr. Hoare gave a talk back during my undergrad, and for some reason, despite being totally checked out of school, I attended. It is one of my formative experiences. AFAICR it was about proving program correctness.

After it finished, during the Q&A segment, one student asked him about his opinion of the famous Brooks essay No Silver Bullet, and Mr. Hoare's answer was... total confusion. Apparently he had not heard of the concept at all! It could be a lost-in-translation thing, but I don't think so, since I clearly remember the phrase "silver bullet" being used, which did not make any sense to me. And now Mr. Hoare and Dr. Brooks are two of my all-time computing heroes.

"Sir", not "Mr." if you're going to be pedantic about titles ;)

Edit: Oh and he has multiple honorary doctorates (at least 6!), so would be just as much "Dr." too!

Sometimes I feel completely separated from mainstream culture.

That the death of Sir Tony Hoare has been completely ignored by mainstream news is one of those times.

Not a peep anywhere for a man who put a big dent in reality.

Mainstream news has not been presenting much genuinely relevant content beyond reporting on wars.
Sir Tony Hoare visited the Institute for System Programming in Moscow and gave a lecture a quarter of a century ago. It was an unforgettable experience to see a living legend of your field. He was already a senior person then, and today I am going to celebrate his long and wonderful life.
I first came across Tony Hoare about 24 years ago while learning C from The C Programming Language by Kernighan and Ritchie. I knew him only as C. A. R. Hoare for a long time. When I got on the Internet, it took me a while to realise that when people said Tony Hoare, it was the same person I knew as C. A. R. Hoare. Quoting the relevant text from the book:

> Another good example of recursion is quicksort, a sorting algorithm developed by C.A.R. Hoare in 1962. Given an array, one element is chosen and the others partitioned in two subsets - those less than the partition element and those greater than or equal to it. The same process is then applied recursively to the two subsets. When a subset has fewer than two elements, it doesn't need any sorting; this stops the recursion.

> Our version of quicksort is not the fastest possible, but it's one of the simplest. We use the middle element of each subarray for partitioning. [...]
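
For illustration, here is a minimal Go sketch of the scheme K&R describe (my code, not the book's C version): the middle element is the pivot, the rest is partitioned around it, and both halves are sorted recursively.

    // quicksort sorts a in place: choose the middle element as pivot,
    // partition the rest into < pivot and >= pivot, then recurse.
    func quicksort(a []int) {
        if len(a) < 2 {
            return // fewer than two elements: nothing to sort
        }
        a[0], a[len(a)/2] = a[len(a)/2], a[0] // move the pivot to the front
        pivot, last := a[0], 0
        for i := 1; i < len(a); i++ {
            if a[i] < pivot { // grow the "less than pivot" region
                last++
                a[last], a[i] = a[i], a[last]
            }
        }
        a[0], a[last] = a[last], a[0] // put the pivot between the partitions
        quicksort(a[:last])           // elements < pivot
        quicksort(a[last+1:])         // elements >= pivot
    }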

It was one of the first few 'serious' algorithms I learnt to implement on my own. More generally, the book had a profound impact on my life. It made me fall in love with computer programming and ultimately choose it as my career. Thanks to K&R, Tony Hoare and the many other giants on whose shoulders I stand.

I wrote both my master's thesis and my PhD on Hoare's Communicating Sequential Processes. I really enjoyed its simplicity and expandability, and I was always amazed that it inspired and influenced language constructs in Go, Erlang, occam and the likes.
TIL his first publication was in Russian, published in the USSR where he spent a year as an exchange student. I wonder if Igor Mel’čuk [0] remembers him.

[0] - https://olst.ling.umontreal.ca/static/melcuk/

Rest in peace, he hasn't seen the industry change.

"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."

-- C.A.R. Hoare's "The 1980 ACM Turing Award Lecture"
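
Modern languages have largely come around to this position. Go, for one, checks every slice and array index at run time by default; a trivial illustration (mine, not from the lecture):

    package main

    import "fmt"

    func main() {
        xs := []int{1, 2, 3}
        fmt.Println(xs[2]) // in bounds: prints 3
        i := 3
        fmt.Println(xs[i]) // out of bounds: the runtime panics instead of
                           // silently reading past the end of the slice
    }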

Leslie Lamport hosted a chat with him a few years back: https://www.youtube.com/watch?v=wQbFkAkThGk. They spoke about general CS stuff and some aspects of concurrency.
RIP.

His presentation on his billion dollar mistake is something I still regularly share as a fervent believer that using null is an anti-pattern in _most_ cases. https://www.infoq.com/presentations/Null-References-The-Bill...

That said, his contributions greatly outweigh this 'mistake'.

You misunderstand the “billion dollar mistake”. The mistake is not the use of nulls per se; the mistake is type systems which do not make them explicit.
Anti-patterns are great; they act as escape hatches or pressure-release valves. Every piece of mechanical equipment has some analogue, for good reason.

Without things like null pointers, goto, globals, and unsafe modes in modern safe(r) languages, you can get yourself into a corner by over-designing everything, often leading to complex unmaintainable code.

With judicious use of these anti-patterns you get mostly good/clean design with one or two well-documented exceptions.

Unless it's greatly exaggerated, he was still quite sharp of mind in his 80s.

SIR_TONY_HOARE = μX • (think → create → give → X)

-- process ran from 1934 to 2026
-- terminated with SKIP
-- no deadlock detected
-- all assertions satisfied
-- trace: ⟨ quicksort, hoare_logic, csp, monitors,
--          dining_philosophers, knighthood, turing_award,
--          billion_dollar_apology, structured_programming,
--          unifying_theories, ... ⟩
-- trace length: ∞

The channel is closed. The process has terminated. The algebra endures.

I saw a casual lecture given by Tony Hoare as a teenager. The atmosphere was warm and welcoming, even if I didn't fully understand all of the content. I remember he was very kind and answered my simple questions politely.
From the article:

> On the topic of films, I wanted to follow up with Tony a quote that I have seen online attributed to him about Hollywood portrayal of geniuses, often especially in relation to Good Will Hunting. A typical example is: "Hollywood's idea of genius is Good Will Hunting: someone who can solve any problem instantly. In reality, geniuses struggle with a single problem for years". Tony agreed with the idea that cinema often misrepresents how ability in abstract fields such as mathematics is learned over countless hours of thought, rather than - as the movies like to make out - imparted, unexplained, to people of 'genius'. However, he was unsure where exactly he had said this or how/why it had gotten onto the internet, and he agreed that online quotes on the subject, attributed to him, may well be erroneous.

Somewhat off-topic, but it's cool hearing this from someone who contributed so much to the fields of programming and mathematics. It makes me hopeful that my own struggles with math will pay off over time!

Here is my favorite visualization of quicksort, by a group of Hungarian dancers:

https://www.youtube.com/watch?v=3San3uKKHgg

Thank you for this!
One of Billy Crystal's later standup bits was talking about how his parents have hit an age where their favorite game with their friends is called, "Guess Who Died". I've been thinking about that bit an awful lot the last couple of years.
"Communicating Sequential Processes" by Tony Hoare: https://www.cs.cmu.edu/~crary/819-f09/Hoare78.pdf

It had intrigued me due to its promise of designing lock-free concurrent systems that can (I think) also be proven to be deadlock-free.

You do this by building a simple concurrent block that is proven to work correctly, and then building bigger ones out of the smaller, proven blocks to create more complex systems.

The way it is designed, processes don't share data and don't take locks. They use synchronized IPC for passing and modifying data. It seemed to be a foundational piece for designing reliable systems that incorporate concurrency.

BTW Rob Pike designed the Go language channels inspired by this work: https://go.dev/tour/concurrency/2
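
A tiny Go sketch of that style (my example, not from the paper): two processes share no data and interact only through a synchronized channel, so every send is a rendezvous with the matching receive.

    package main

    import "fmt"

    // producer shares no memory with its consumer; it only sends
    // values over the channel.
    func producer(out chan<- int) {
        for i := 0; i < 3; i++ {
            out <- i // blocks until the receiver is ready: a rendezvous
        }
        close(out)
    }

    func main() {
        ch := make(chan int) // unbuffered: send and receive synchronize
        go producer(ch)
        for v := range ch { // the consumer owns each value it receives
            fmt.Println("got", v)
        }
    }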
Tony Hoare on how he came up with Quicksort:

he read the algol 60 report (Naur, McCarthy, Perlis, …)

and that described "recursion"

=> aaah!

https://www.youtube.com/watch?v=pJgKYn0lcno

I had the privilege to attend his "Billion Dollar Mistake" talk in person at QCon London 2009. Little did I know that this talk would go down in software development history! What an honor to have witnessed this live!

https://www.infoq.com/presentations/Null-References-The-Bill...

“I never had a doctorate, so I had to make do with Quicksort.” —Sir Tony Hoare (unpublished interview for Algorithms to Live By)
With respect I say that one can only feel gobsmacked at how much complexity has grown.

In the 60s inventing one single algorithm with 10 lines of code was a thing.

If you did that today nobody would bat an eye.

Today people write game engines, compilers, languages, whole OS and nobody bats an eye cause there are thousands of those.

Quick sort isn't even a thing for leet code interviews anymore because it's not hard enough.

Rest in peace to a real one, we've lost one of the brightest minds of our century
{"deleted":true,"id":47324456,"parent":47324054,"time":1773155944,"type":"comment"}
{"deleted":true,"id":47324455,"parent":47324054,"time":1773155938,"type":"comment"}
I watched this video a few months ago.

Virtual HLF 2020 – Scientific Dialogue: Sir C. Antony R. Hoare/Leslie Lamport

https://www.youtube.com/watch?v=wQbFkAkThGk

One of the most important papers of all time.

Just one word: Quicksort.

One-of-a-kind genius.

This is devastating news.

When I started university he gave a talk to all the new CompScis which as you can imagine was incredibly inspirational for an aspiring Software Engineer.

Grateful to have had that experience.

RIP

Just two days ago I was curious about the PhD advisor of my PhD advisor and so on, and discovered that I am actually an academic great-grandson of Hoare (shame on me, I should have realised that earlier), and joked, "Wow, they are all still alive!". Then I saw the news yesterday on HN.
"The null reference was my billion dollar mistake responsible for innumerable errors, vulnerabilities and system crashes" (paraphrasing). I don't know. This design choice exposed the developer to system realities, and modern language approaches are based on decades of attempts to improve on it, and they are not necessarily better. Safer yes, but more weighty.

Can anyone suggest a better approach for a situation like this in the future? What's better than addressing the problem with a lightweight solution?

"The problem isn't the concept of 'null', but rather that everything can be null, which makes it impossible to distinguish between the cases where null is an appropriate and expected value, from the cases where null is a defect."

https://blog.ploeh.dk/2015/04/13/less-is-more-language-featu...

Which system reality? Plenty of architectures don't have a concept of a null pointer at the hardware level. Other architectures provide multiple address spaces, or segmented memory addressing. Even when a null pointer exists at the hardware level, it doesn't have to be the zero address.

Null pointers are a software abstraction, and nowadays we have better abstractions.

Some kind of an optional/variant type, enforced by the type system.
1) ACM published this book in 2021: Theories of Programming: The Life and Works of Tony Hoare - https://dl.acm.org/doi/book/10.1145/3477355

See the "preface" for details of the book - https://dl.acm.org/doi/10.1145/3477355.3477356

Review of the above book - https://www.researchgate.net/publication/365933441_Review_on...

Somebody needs to contact ACM and have them make the above book freely available now; there can be no better epitaph.

2) Tony Hoare's lecture in honour of Edsger Dijkstra (2010); What can we learn from Edsger W. Dijkstra? - https://www.cs.utexas.edu/~EWD/DijkstraMemorialLectures/Tony...

Somebody needs to now write a similar one for Hoare.

Truly one of the absolute greats in the history of Computer Science.

A collection of Tony Hoare's memorable quotes - https://en.wikiquote.org/wiki/C._A._R._Hoare
Hoare's papers were some of my favorites. Rest in peace
Assert early, assert often!
Absolutely the GOAT of concurrency. May his ring never die.
Incredibly sad news. His contributions to the foundations of computing will remain relevant for generations to come.
Hoare logic: https://en.wikipedia.org/wiki/Hoare_logic

> The central feature of Hoare logic is the Hoare triple. A triple describes how the execution of a piece of code changes the state of the computation. A Hoare triple is of the form {P}C{Q} where P and Q are assertions and C is a command.

> P is named the precondition and Q the postcondition: when the precondition is met, executing the command establishes the postcondition. Assertions are formulae in predicate logic.

> Hoare logic provides axioms and inference rules for all the constructs of a simple imperative programming language.

[...]

> Partial and total correctness

> Using standard Hoare logic, only partial correctness can be proven. Total correctness additionally requires termination, which can be proven separately or with an extended version of the While rule.
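
As a toy illustration (mine, not from the Wikipedia article): the triple {x = n} x := x + 1 {x = n + 1} can be rendered as runtime checks bracketing the command, where n is the logical variable recording x's initial value.

    package main

    import "fmt"

    // A runtime rendering of the Hoare triple {x = n} x := x + 1 {x = n + 1}.
    func increment(x, n int) int {
        if x != n {
            panic("precondition {x = n} violated")
        }
        x = x + 1 // the command C
        if x != n+1 {
            panic("postcondition {x = n + 1} violated")
        }
        return x
    }

    func main() {
        fmt.Println(increment(5, 5)) // prints 6; both assertions hold
    }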

His paper on communicating processes was a great read when I was new to computer science research.
I was lucky enough to see Sir Tony Hoare speak at EuroPython in 2009. This was his last slide:

One Day

- Software will be the most reliable component of every product which contains it.

- Software engineering will be the most dependable of all engineering professions.

- Because of the successful interplay of research:

  - into the science of programming;

  - and the engineering of software.

Come on, people, we have a lot of work to do.
He was a professor at my old alma mater, Queen's University of Belfast. I remember hearing a story about him going to Harvard to give a lecture and, as he was presented, one of their professors referred to himself as the "Hoare of Harvard"
wow, but lived a full life
I didn't know him, but every time I've read one of his papers I've learned something and changed my viewpoint. I'm never worried that my effort will be wasted reading Hoare (especially rare in a field that moves as fast as applied CS).
Rest in peace, Sir Tony Hoare
always knew him as C.A.R. Hoare, takes me way back to freshman college years

RIP good sir

RIP Sir Tony Hoare

Turing Award Legend.


I am greatly saddened by the passing of Tony Hoare. His work has affected me deeply; in personal, academic, and professional life. Without visionaries like him, I would not find the love in computer science as I do now. It would be a great honor to have accomplished a fraction of what he did. My condolences to his close friends and family.
Rest in peace.
One of the greatest figures in the history of computing, and an example of humility as a human.

Thank you for your work on ALGOL; you were multiple decades ahead of your time.

Rest in peace.

How many jobs were won or lost based on a candidate's ability to implement his algorithms?
As a junior dev, I loved to ask interview candidates to implement merge sort or quick sort on whiteboards.

As a non-junior dev I realize how stupid that was.

Needs a black bar!
Finally. The black bar is there.
{"dead":true,"deleted":true,"id":47326251,"parent":47324416,"time":1773163585,"type":"comment"}
The part about 'genius' being slow and about wrestling with difficult problems resonates.

The idea of 'genius', or in fact 'intelligence', being about speed isn't just a Hollywood thing, though; it's been a Silicon Valley thing as well: it's why most big tech interviews are time-constrained.

Over the years, I've also heard many tech CEOs say things along the lines of "There's only one type of intelligence" or "All intelligent people are intelligent in the same way."

These kinds of statements raised my eyebrows, but now that LLMs can solve most puzzle problems rapidly yet struggle with complex problems, it's completely obvious that this is not the case.

What it says is frightening. The CEOs of big companies have been giving positions to people who have the same thinking style as them. Quick puzzle-solving tech tests are literal discrimination against the neurodivergent, and also against geniuses. They've been embracing wordcels and rejecting shape rotators.

I think a guy like Tony Hoare would struggle to find a job these days.

You could argue that the issue extends beyond Hollywood and Silicon Valley... The whole education system is centered around puzzle-solving speed. It's hilarious that AI is now solving all these tests within minutes with better scores than humans. Crazy to think that LLMs could graduate from university based on current assessment policies! It's very revealing of what kind of education system we have.

this is black bar grade great. give us black bar
https://news.ycombinator.com/item?id=47316880


Thanks! We'll merge those comments hither.

I uploaded the lecture to Claude and asked it to create a skill using the principles described. I guess we shall see if AI can actually follow them. :)