fanf: (dotat)
[personal profile] fanf
"Bad programmers worry about the code. Good programmers worry about data structures and their relationships." - Linus Torvalds

"If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming." - Rob Pike

"Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won’t usually need your flowcharts; they’ll be obvious." - Fred Brooks

Date: 2014-07-14 13:45 (UTC)
gerald_duck: (duck and computer)
From: [personal profile] gerald_duck
Sadly, I feel those admirable principles are becoming gradually less applicable.

While that's the situation if a single competent computer scientist has been near the code and data, these days it's all too common to find one programmer doing an O(n) search on data another has given them in lexical order. Or, worse, doing an O(log n) search on unsorted data, where the test corpus just happened to be sorted because of its provenance.
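To illustrate the second failure mode, here is a minimal sketch (hypothetical names, using Python's standard `bisect` module): a binary search silently returns wrong answers when its sortedness precondition is violated, even though it can appear to work on a test corpus that happens to be sorted.

```python
import bisect

def contains_binary(xs, target):
    # Binary search via bisect_left: correct ONLY if xs is sorted.
    i = bisect.bisect_left(xs, target)
    return i < len(xs) and xs[i] == target

print(contains_binary([2, 4, 6, 8, 10], 4))   # True: precondition holds
print(contains_binary([10, 2, 8, 4, 6], 4))   # False, despite 4 being present
```

The second call fails not with an error but with a plausible-looking wrong answer, which is exactly why a sorted-looking test corpus hides the bug.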

Revealing the data structure and making explicit the invariants which apply goes a lot further towards making the algorithm obvious.
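One way to make such an invariant explicit is to let the data structure enforce it, so the fast algorithm is always valid. A minimal sketch (hypothetical `SortedList` class, again using Python's `bisect`):

```python
import bisect

class SortedList:
    """A list that maintains its elements in sorted order,
    so O(log n) membership tests are always correct."""

    def __init__(self, items=()):
        self._xs = sorted(items)          # establish the invariant

    def add(self, x):
        bisect.insort(self._xs, x)        # insert while preserving it

    def __contains__(self, x):
        i = bisect.bisect_left(self._xs, x)
        return i < len(self._xs) and self._xs[i] == x
```

Because sortedness is maintained by construction rather than assumed, the choice of search algorithm becomes obvious from the type alone.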

But then there are a great many people who touch code and data these days who simply don't think at all in terms of DSA. I suppose that just means that when you see the data isn't structured you can tell there's no algorithm to speak of, but still…

Date: 2014-07-15 08:05 (UTC)
ext_8103: (Default)
From: [identity profile] ewx.livejournal.com
Perhaps worth noting that a form of this holds true down at the micro-optimization level, when you start worrying about locality &c. I don't know if there are any pithy quotes expressing this to be found l-)

Date: 2014-07-15 14:05 (UTC)
simont: A picture of me in 2016 (Default)
From: [personal profile] simont
It occurs to me that compiler optimisation is a prominent exception to this rule. (I don't really want to say 'counterexample', since it doesn't invalidate the sensibleness of the rule in many other contexts.)

In a compiler mid-end, carefully chosen data structure representation can only take you so far. Certainly you can do things like using a flowgraph representation rather than linear code with explicit branches and labels, using virtual rather than physical registers for as long as possible, doing SSA if that seems like a good idea, and so on. But sooner or later you still end up with a long list of instructions to do a bunch of things in a particular order, and you have to ask yourself: 'What other list of instructions can I prove to be semantically equivalent to this one, and judge to have better performance and/or code size?' There are lots and lots of different techniques for finding such lists, and very few of them are immediately obvious given the input data structure.

Date: 2014-07-16 09:46 (UTC)
simont: A picture of me in 2016 (Default)
From: [personal profile] simont
Easier, certainly, but still a long way from 'obvious given the data representation'! But I suppose if you're prepared to weaken the rule to say something closer to 'algorithms are easier if you pick the right data structure first' in place of 'algorithms are trivial and/or self-evident given the right data structure', then yes.

