A Dialog (between friends) on The Law of the Conservation of Computation

Russell [8:16 AM] 
This is happening to programs and programming too. http://www.worksonbecoming.com/thoughts-prefaces/2015/10/1/this-is-contingency

Some Works
This is contingency
Remarks on the contingency of new forms and the phenomenon of replication.

Schoeller [8:16 AM] 
Very NKS.

Schoeller [8:17 AM]
I’m somewhat less certain of this outcome than you — it relies heavily on everyone playing nice and working with each other.

Russell [8:18 AM] 
That's just you.

Schoeller [8:18 AM] 
Which is challenging — witness the web API boom/bust of 5 years ago.

Russell [8:18 AM] 
The arc of assimilation is clear

Schoeller [8:18 AM] 
It’s the pragmatism/skepticism in me.

Russell [8:19 AM] 
Most humans, almost 6.997 billion of them, have no idea about computers

Schoeller [8:19 AM] 
I get it. But the pace of progress can be furiously slow in the face of economics.

Schoeller [8:19 AM]
For instance — where’s my flying car?

Schoeller [8:20 AM]
We’re not going to have networked 3D printed robots manufacturing things for some time.

Russell [8:20 AM] 
That's not progress

Russell [8:20 AM]
Flying cars aren't selected for

Russell [8:20 AM]
They lack survivability value

Russell [8:21 AM]
Amazon's Prime delivery moving to Amazon Flex... As they push delivery times to zero, one must manufacture close to the source

Russell [8:21 AM]
Of the transaction

Russell [8:21 AM]
It's happening

Russell [8:21 AM]
Who needs to fly except the drones

Schoeller [8:21 AM] 
I understand the vision. I’m just not convinced it’ll happen...

Schoeller [8:22 AM]
Well me for one :simple_smile:

Schoeller [8:22 AM]
Drone delivery is another unlikely occurrence at any large scale.

Schoeller [8:23 AM]
The economics/logistics just don’t make any sense. Packages are heavy.

Russell [8:23 AM] 
Personal drivers. Personal shoppers. Personal virtual assistants. ... All are shaping the world to not need all this movement. Once we're three degrees removed from these activities we won't care if it's a machine doing it all.

Russell [8:24 AM]
So make people want less heavy stuff

Russell [8:24 AM]
Sell them a kindle and ebooks

Russell [8:24 AM]
:)

Schoeller [8:24 AM] 
Agree on that.

Russell [8:24 AM] 
It's happening.

Schoeller [8:24 AM] 
Although (sidebar) dead-tree’s not dead.

Schoeller [8:25 AM]
You can’t digitize the tactile feel of thumbing through the pages of a book.

Schoeller [8:25 AM]
I suspect it’ll become boutique. Soft-cover trade books are done. But hardcover, well-bound, limited edition will carry on and do quite well.

Russell [8:27 AM] 
Nice try

Schoeller [8:27 AM] 
Back on track — A lot of this future stuff is the same: the hyperloop is just the next space elevator which was the next flying car, etc.

Russell [8:27 AM] 
You can destroy people's ability to touch

Russell [8:27 AM]
Negative sir

Schoeller [8:27 AM] 
I like my fingers, thank you very much :wink:

Russell [8:27 AM] 
I'm making a much bigger systematic argument

Russell [8:28 AM]
Don't care about the specific forms

Russell [8:28 AM]
Only that forms get selected and replicated

Schoeller [8:28 AM] 
Well, it has to be grounded in something.

Russell [8:28 AM] 
Replicability!

Russell [8:28 AM]
Is it computationally efficient!

Russell [8:29 AM]
Boom boyeeee

Schoeller [8:29 AM] 
Much of the problem of flying cars, drone delivery, space elevators, 3d printed manufacturing, and hyperloops is the connection from physics -> economics.

Schoeller [8:29 AM]
We don’t have that with software. There, the challenge is the rate and format of the bits flying around.

Russell [8:30 AM] 
Hence computationally efficient

Russell [8:30 AM]
Economic networks also replicate computational efficiency.

Russell [8:31 AM]
Commodities have stable-ish values because the idea is computationally efficient. Utility etc. is well established in the network. So they are exchanged, etc.

Schoeller [8:32 AM] 
You’re asserting, then, that competition == computational efficiency?

Russell [8:32 AM] 
Correct

Russell [8:32 AM]
Efficiency must have survivability.

Russell [8:32 AM]
The trivial would not be efficient for economies

Schoeller [8:33 AM] 
I can buy that. At least in the sense of efficiency from the perspective of the system as a whole. Not for any given agent participating in the system.

Russell [8:33 AM] 
Yes.

Schoeller [8:33 AM] 
The agents are horrifically inefficient.

Schoeller [8:33 AM]
(individually)

Russell [8:34 AM] 
Hard to separate them from the system

Schoeller [8:34 AM] 
True, unless you’re an agent.

Russell [8:34 AM] 
I believe there is a law of the conservation of computation.

Schoeller [8:35 AM] 
computation can neither be created nor destroyed, but can only change form?

Russell [8:35 AM] 
Correct

Russell [8:36 AM]
And that results in all other conservation laws

Russell [8:36 AM]
And is why competition in all networks is computationally efficient

Russell [8:36 AM]
And cannot be any other way

Schoeller [8:36 AM] 
It’ll take a bit for me to wrap my head around that idea.

Russell [8:37 AM] 
The singularity is pure probability.  Computationally irreducible.

Russell [8:37 AM]
Once probability breaks down into four forces and matter and light etc. we have pattern

Russell [8:37 AM]
But by the law of the conservation of computation it can't go to all pattern.

Russell [8:37 AM]
Or that would reduce computation

Russell [8:38 AM]
So competition between networks must proceed.

Russell [8:40 AM]
And per my blog post, the idea is that replication normalizes nodes in the network; as they become more fully normalized, the network of replication starts to collide with other networks of replication, where the normalizations selected start competing. Until a new form and new networks begin the process again.

Russell [8:40 AM]
Computation merely moves around these networks as the processes of complexification and simplification double back over and over.

Russell [8:40 AM]
Even any American company is an example

Russell [8:41 AM]
We are simplifying and normalizing them all the time.

Russell [8:41 AM]
Employees replicate basic skills

Russell [8:41 AM]
And we recruit for these skills

Russell [8:41 AM]
Revenue lines get simplified

Russell [8:41 AM]
Marketing simplifies messages to the world

Russell [8:42 AM]
All for survivability.

Russell [8:42 AM]
But this also exposes companies to competition

Russell [8:42 AM]
It gets easier to poach employees.  And to see ideas and strategies on the outside.

Russell [8:42 AM]
Soon it tips and companies need new products. New marketing. New employees.

Russell [8:43 AM]
All the while computation is preserved in the wider network

Schoeller [8:43 AM] 
Where I’m struggling is how this copes with the notion that the universe tends toward disorder.

Russell [8:44 AM] 
Normalized forms become dispensable as individual nodes.

Russell [8:44 AM]
Disorder is pure noise.

Schoeller [8:44 AM] 
Order in the universe is effectively random.

Russell [8:44 AM] 
Total entropy.

Russell [8:45 AM]
Which means, if every network normalizes towards highly replicated forms, they have less internal competition. They have heat death.

Russell [8:45 AM]
Which is total entropy.

Russell [8:45 AM]
Again. A singularity is pure probability.

Russell [8:46 AM]
No pattern.

Russell [8:46 AM]
Randomness.

Schoeller [8:46 AM] 
I can buy that. Certainly there’s a low probability that any agent will succeed, thus the entropy tends to increase.

Russell [8:46 AM] 
Fully replicated forms are those that maximize survivability.

Russell [8:46 AM]
So some super weird platonic object between order and chaos

Russell [8:46 AM]
Between infinities.

Russell [8:46 AM]
A circle for example is a weird object

Russell [8:47 AM]
Rule 110 is a weird object
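
[Aside: a minimal sketch of Rule 110 in Python, purely illustrative; the rule table is the real one, but the grid width and number of generations are arbitrary choices. It shows the kind of object "between order and chaos" being pointed at: simple local rules that produce neither pure noise nor dead pattern.]

    def rule110_step(cells):
        # Each new cell depends on (left, center, right); rule 110 is the
        # lookup table 0b01101110 indexed by that 3-bit neighborhood.
        rule = 0b01101110
        n = len(cells)
        return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1)
                          | cells[(i + 1) % n])) & 1 for i in range(n)]

    cells = [0] * 63 + [1]  # start from a single live cell
    for _ in range(32):     # print 32 generations of the evolving pattern
        print("".join("#" if c else "." for c in cells))
        cells = rule110_step(cells)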

Schoeller [8:47 AM] 
Here’s a question — where does the computation come from to achieve fully replicated forms?

Schoeller [8:48 AM]
Presumably there’s some notion of “potential” computation?

Russell [8:48 AM] 
Negative.

Russell [8:48 AM]
There's only computation

Russell [8:48 AM]
Potential is a relational concept

Schoeller [8:49 AM] 
Hmm… then back to my question.

Russell [8:49 AM] 
There is no potential time

Russell [8:49 AM]
There is no potential dimension

Russell [8:50 AM]
There is no potential temperature

Schoeller [8:50 AM] 
Right, but time only moves forward — there’s no notion of conservation of time.

Russell [8:50 AM] 
Ah!

Russell [8:50 AM]
But I'm suggesting there is

Russell [8:50 AM]
Time is computation

Schoeller [8:50 AM] 
Actually, there is potential temperature. Temperature == energy.

Russell [8:51 AM] 
Yes it gets rather semantic

Schoeller [8:51 AM] 
The whole field is “thermodynamics”

Russell [8:51 AM] 
Yes which is superseded by computation

Russell [8:51 AM]
Hence why info theory and thermodynamics are isomorphic

Russell [8:51 AM]
They are just substrate discussions

Russell [8:51 AM]
Which go away in the math

Schoeller [8:52 AM] 
Well, strictly speaking that math doesn’t govern, but attempts to describe.

Russell [8:53 AM] 
Look at how computer science handles time

Russell [8:53 AM]
Steps or cycles

Russell [8:53 AM]
It defines time as compute steps

Russell [8:53 AM]
Hahahahaha
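
[Aside: a toy illustration of "time as compute steps" in Python; the Collatz function here is just an arbitrary example computation, chosen because its step count is easy to see.]

    def collatz_steps(n):
        # The only clock this computation has is its own step counter:
        # one update of n is one tick of "time".
        steps = 0
        while n != 1:
            n = n // 2 if n % 2 == 0 else 3 * n + 1
            steps += 1
        return steps

    print(collatz_steps(27))  # 111 steps: time measured in updates, not seconds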

Schoeller [8:53 AM] 
If info theory and thermo are isomorphic, then the principle of potential has to translate in some way. It’s important because that’s one of the foundations of conservation of energy.

Russell [8:54 AM] 
Yes yes

Russell [8:54 AM]
I'll find a translation for you

Russell [8:54 AM]
It's got something to do with Chaitin's number

Schoeller [8:55 AM] 
Computer science handles time as a long from a particular, arbitrary point. And calculates differences as a byproduct of the way it operates.

Schoeller [8:55 AM]
A “quantum” computer would handle time very differently.

Russell [8:56 AM] 
Yes. Keep going.

Schoeller [8:56 AM] 
“We” calculate time from celestial positions.

Schoeller [8:56 AM]
None of that relates to the more generalized notion of time.

Russell [8:57 AM] 
I propose the translation of time fits within the law of conservation of computation

Russell [8:57 AM]
Quantum computers are closer to singularities. Computing with pure probabilities

Russell [8:57 AM]
Classical computers compute with approximated machine precision probabilities
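
[Aside: a tiny illustration of "approximated machine precision probabilities"; ordinary floating point can only approximate most real-valued probabilities.]

    # Probabilities on a classical machine are finite-precision approximations:
    print(0.1 + 0.2)         # 0.30000000000000004, not exactly 0.3
    print(0.1 + 0.2 == 0.3)  # False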

Russell [8:58 AM]
Somewhere things get super weird with math (algebra and geometry meets probability theory)

Russell [8:58 AM]
Math itself suffers same challenge

Schoeller [8:59 AM] 
Yes, well math likes to be very precise.

Russell [8:59 AM] 
That which symbolically lacks pure probability, humans and classical computers can handle

Russell [9:00 AM]
Once you deal with infinities and infinitesimals you start getting to pure probabilities and math theory starts bleeding.

Schoeller [9:00 AM] 
Okay, so I can accept a notion of a conservation of probability of time.

Russell [9:00 AM] 
N-order logics require n+1 order and incompleteness and set paradoxes.

Russell [9:01 AM]
Math itself becomes computationally weird.

Schoeller [9:01 AM] 
i.e. that the probability of an event occurring or not occurring within a system is 1. Of course, that’s tautological.

Schoeller [9:02 AM]
But also that it would hold for any number of events over any set of times.

Russell [9:02 AM] 
Because once a math system becomes computationally inefficient it all of a sudden is incomplete. And we reduce to "some things are true but we can't prove them in this system"

Russell [9:03 AM]
Yes pure probability is binary.  Either everything happens or nothing happens.

Russell [9:03 AM]
If everything happens you must conserve computation as that everything happens

Russell [9:03 AM]
Can't be more than 1! Can't be less than 1!

Schoeller [9:04 AM] 
Well, I think what I’m saying is that my need for “potential” computation is solved by probability.

Russell [9:04 AM] 
And local events of everything take on less than all computation because of the halting problem.

Schoeller [9:04 AM] 
Although I haven’t completely convinced myself.

Russell [9:04 AM] 
If the halting problem weren't true every event / computation could self inspect and computation would tend to 0
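
[Aside: the standard self-inspection argument, sketched in Python. halts() is a hypothetical oracle, not a real function; the point is that assuming it exists leads to a contradiction, which is why a computation cannot fully inspect itself.]

    def halts(program, data):
        # Hypothetical oracle: True iff program(data) eventually halts.
        # No such total function can exist, as contrary() shows.
        raise NotImplementedError

    def contrary(program):
        # Do the opposite of whatever the oracle predicts for a program
        # run on its own source.
        if halts(program, program):
            while True:
                pass      # loop forever
        else:
            return        # halt immediately

    # contrary(contrary) is contradictory either way:
    # if halts(contrary, contrary) is True, contrary(contrary) never halts;
    # if it is False, contrary(contrary) halts. So halts() cannot exist.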

Russell [9:05 AM]
Chaitin's number is a measure of probability

Russell [9:05 AM]
Complexity is a measure of probability

Russell [9:05 AM]
Probability is a notion of unknown information

Russell [9:05 AM]
All data of everything would contain every program and all outputs

Russell [9:06 AM]
And has a probability of any and all events totaling 1. All information is known

Russell [9:06 AM]
And at the same time it is 0

Schoeller [9:06 AM] 
Here wouldn’t the truth of the halting problem arise from the fact the system is influenced by elements outside the system?

Russell [9:06 AM] 
Because all information is computationally irreducible of the maximal kind

Schoeller [9:06 AM] 
(ie. similar to thermo)

Schoeller [9:06 AM]
Therefore a computation can never know its inputs.

Russell [9:06 AM] 
Yes.  Halting problem is exactly that

Russell [9:06 AM]
Unknowns

Schoeller [9:06 AM] 
And thus, can never know its outputs.

Schoeller [9:07 AM]
Because the program can’t see beyond itself.

Russell [9:07 AM] 
It's not a matter of inputs

Russell [9:07 AM]
It emerges from computation!

Russell [9:07 AM]
Elementary CAs show this

Russell [9:07 AM]
Gödel showed this

Russell [9:08 AM]
Mere DESCRIPTION!  Description is computation

Russell [9:09 AM]
I think wolfram gave in too easily

Russell [9:09 AM]
He still believes in Euclidean time

Russell [9:09 AM]
Or whatever Greek time

Schoeller [9:10 AM] 
Right. And if computation is probabilistic, the program couldn’t even know, necessarily, what it was actually computing at any given point (until that point occurs).

Schoeller [9:11 AM]
Yeah, I think your theory only works if time is a probability not a discrete measure.

Russell [9:12 AM]
Time isn't discrete.

Russell [9:12 AM]
It's pure difference

Schoeller [9:12 AM] 
Which is really to say that the outcome of a computation can’t be known until the state of the system is known.

Schoeller [9:12 AM]
Which itself can’t be known with any certainty until it occurs.

Schoeller [9:13 AM]
Or, it’s all wibbly, wobbly, timey, wimey stuff.

Schoeller [9:14 AM]
Or, possibly the Heisenberg uncertainty principle as applied to computation.

Russell [9:14 AM] 
But 2+2 is 4

Schoeller [9:14 AM] 
Only if the state of the system is consistent.

Schoeller [9:14 AM]
(which it happens to be)

Russell [9:15 AM] 
And that math statement is a "localized" statement

Schoeller [9:15 AM] 
So, the probability of 2+2=4 is very, very close to 1, but not exactly. Possibly so close that in the limit it approaches 1.

Schoeller [9:16 AM]
Right. So, part of why the state for 2+2=4 is consistent is because we’ve defined it that way.

Russell [9:16 AM] 
It's what I call robust

Russell [9:16 AM]
In most universes 2+2 is 4

Russell [9:16 AM]
In the multiverse there are universes where that's not true

Schoeller [9:16 AM] 
But, if you shift from say cartesian to spherical, it doesn’t necessarily hold unless you change what “2” and “4” mean.

Russell [9:17 AM] 
But those are very small universes that reduce quickly

Russell [9:17 AM]
Yes.

Russell [9:17 AM]
Thank you!

Schoeller [9:17 AM] 
i.e. their definition is relative to the system you’re computing within.

Russell [9:17 AM] 
Counting and the math emerge from the computational systems

Russell [9:17 AM]
Yes.

Russell [9:18 AM]
And in the entirety of the multiverse all maths exist.  All description exists.

Schoeller [9:19 AM] 
Sure. That’s as tautological as the probability that something either exists or does not is 1.

Schoeller [9:20 AM]
Since the probability of anything existing within an infinite, unbounded system would also be 1.

Russell [9:20 AM] 
And your point?

Russell [9:21 AM]
Math loves tautologies

Russell [9:21 AM]
We have to state them all the time

Russell [9:21 AM]
Or reduce to them

Schoeller [9:22 AM] 
Well, it’s consistent with probability theory. So, that’s nice.

Russell [9:22 AM] 
Is that what symbolics and rule replacements are?

Russell [9:23 AM]
One giant computational tautology

Schoeller [9:23 AM] 
If you’re going to have a theory that talks about local behavior within systems, you have to have consistency when you take that to its extreme limit — such as when the system contains everything possible.

Schoeller [9:24 AM]
Aren’t you just describing the state of the system with symbolics and rules?

Russell [9:25 AM] 
Sure.

Russell [9:25 AM]
And the state of everything is what?

Schoeller [9:25 AM] 
Here describe means “govern” (unlike my earlier math statement)

Russell [9:26 AM] 
Isn't that the state of all sub states or local states?

Russell [9:26 AM]
Of which some local states are meta descriptions of sub sub states or neighboring states

Schoeller [9:26 AM] 
I think the state of everything is that the probability of anything is 1.

Schoeller [9:26 AM]
It’s rather useless, but so is the notion of the state of everything.

Russell [9:27 AM] 
Govern gets tricky because it's not sensible as a fundamental concept. E.g. the spin of a quark doesn't govern. It's just a property.

Russell [9:27 AM]
Gravity and the other forces don't govern.

Russell [9:28 AM]
They are descriptions of relationships

Schoeller [9:28 AM] 
Sure, but the definition of “2” on a Cartesian plane is.

Russell [9:28 AM] 
If gravity is merely space-time curvature, a geometry, that doesn't mean it governs.

Russell [9:28 AM]
What is the definition of 2 governing?

Schoeller [9:29 AM] 
It’s governing the behavior of 2 within the cartesian system.

Russell [9:29 AM] 
It's merely a description of relations between an x position and a y position on a description of a plane

Schoeller [9:29 AM] 
i.e. that 2 can’t be 3 or an apple.

Russell [9:30 AM] 
Ah.  Yes.  Definition bounds localized networks.

Russell [9:30 AM]
2 is a 3 in some systems

Russell [9:31 AM]
Say a simple system of primes and non primes without concern of actual quantity
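
[Aside: a toy sketch in Python of the "primes and non-primes without concern of actual quantity" system; the labels are made up for illustration. Within this system, 2 and 3 collapse to the same symbol.]

    def classify(n):
        # This system only distinguishes PRIME from NONPRIME and
        # throws away actual quantity.
        if n < 2:
            return "NONPRIME"
        return "PRIME" if all(n % d for d in range(2, int(n ** 0.5) + 1)) else "NONPRIME"

    print(classify(2), classify(3))  # PRIME PRIME: 2 "is" 3 here
    print(classify(4))               # NONPRIME: and "4" lives elsewhere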

Schoeller [9:31 AM] 
I think this idea holds. The symbols and rules govern the system in a computational sense. But that does not mean that the system itself governs any physical phenomena. Only that it describes (to the extent that the rules reasonably describe the same.)

Schoeller [9:31 AM]
— moving back to describe and govern meaning different things --

Russell [9:31 AM] 
Yes, I'm in agreement

Russell [9:31 AM]
Govern is a localized concept of bounding relations

Russell [9:32 AM]
Let's return to the main q in all this

Russell [9:32 AM]
WHAT DOES THE WORK OF COMPUTATION?

Schoeller [9:32 AM] 
Yes, bounding relations that define a specific system within the multiverse of possible systems.

Schoeller [9:35 AM]
Well, the computation would have to be done within the medium of the system, right?

Schoeller [9:36 AM]
It can’t be just one thing. Because we’ve already enumerated that there are quantum computers that are different from regular computers, which are different from the human brain.

Russell [9:36 AM] 
yeah, i haven't figured this out.

Russell [9:36 AM]
other than, it's everything i'm trying to figure out.

Schoeller [9:37 AM] 
And to some degree, you pick the computational medium when you define the system. At least in the programming world. Mathematica vs Java vs Spark.

Russell [9:38 AM] 
i think it's this.... or related.... to perceive/observe/describe/explain at all, whatever sub network of everything (whatever universe, computer, entity, person, rock...) IS.  and the IS and IS NOT of breaking out of total relation to everything is COMPUTATION.  and it's a super weird notion.  but the mere simplification of total relation to partial relation IS the COMPUTATIONAL ACT.

Schoeller [9:39 AM] 
And with a math problem, you’re defining the computational medium to be the human brain.

Russell [9:40 AM] 
well, within the human / this universe frame of reference or partial relation to everything, yes.

Schoeller [9:41 AM] 
Agree that it’s a weird notion that computational singularity doesn’t “seem” to underlie everything. But the rules and computation have to be related and even dependent.

Russell [9:41 AM] 
whether we can COMPUTE  or "IS" with a different substrate... well, i think so.... i think "computers" and "virtual reality" are moving our COMPUTE/DESCRIPTION/RELATION to everything beyond/outside the Human Brain.

Schoeller [9:42 AM] 
So, it’s easier if we constrain ourselves to the systems we make up.

Schoeller [9:43 AM]
As for what computes the physical world — maybe there’s a lesson in evolution theory, where “computation” is quite literally random mutations of the medium itself.

Schoeller [9:44 AM]
And where the “selection”/“survival”/“success” of the computation occurs outside the system (back to the halting problem discussion above)

Schoeller [9:46 AM]
I should clarify "But the rules and computation have to be related and even dependent.” … within a system. In the multiverse, anything goes. :simple_smile:

Russell [9:48 AM] 
yes, on your evolutionary theory... or something similar to that. the resolution of probabilities IS computation. resolution being like the resolution of superpositions in quantum stuff.

Russell [9:48 AM]
i believe that basically happens as you move from logic systems, computational systems, i.e. Russell's theory of types etc.

Russell [9:49 AM]
related to all this mumbo jumbo: http://plato.stanford.edu/entries/quine-nf/

Schoeller [9:50 AM] 
It’s an example of a chaotic system where order appears to arise naturally, so it seems like it’d be a reasonable starting place to think about other physical systems.

Russell [9:50 AM] 
yes, i say we conclude there for now

Schoeller [9:50 AM] 
I think the key is the halting problem bit — that the computation can’t possibly know if it’s successful. That occurs outside the system where the computation is valid. It only blindly executes.

Russell [9:51 AM] 
we've created something between chaos and order in this dialog

Russell [9:51 AM]
which will be non trivial to clear up.