Friday, July 18, 2008

Unpacking

Unpacking and Packing Information.
Andre Willers
18 July 2008

Discussion :
Also known as data compression or decompression .
Well-known as zip/unzip in computers .

Why pack information?
1. MiniMax
To transmit information across space and time with less energy, less time or less disruption.
Please note that packing is used in systems subject to competition pressures, so minimizing disruption of messages is critical.

2. Abstraction.
Even a simple Pkzip compression contains more information than the original message. For instance, Pkzip has been used successfully to identify an author's style or the original language of an encrypted message.
A civilization can be described by zipping the yellow pages.
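A toy sketch of this compression-as-fingerprint trick, assuming zlib as a stand-in for Pkzip: the reference text whose patterns best explain an unknown sample grows the least when the two are compressed together. The function names are illustrative.

```python
import zlib

def packed_size(text):
    # Size of the zip-style compression of the text.
    return len(zlib.compress(text.encode("utf-8"), 9))

def attribute(unknown, references):
    # references: {"author or language name": sample text}
    # The smaller the growth when compressing reference + unknown together,
    # the more of the reference's patterns the compressor could reuse.
    def growth(sample):
        return packed_size(sample + unknown) - packed_size(sample)
    return min(references, key=lambda name: growth(references[name]))

# Hypothetical usage:
# attribute(intercepted_text, {"English": english_sample, "French": french_sample})
```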

2.1 Layers:
Primitive packing is in separate layers .
Languages are a good example .
We can define the top layers as more abstract , and delve down into deeper layers of meaning and definition.

2.2 Fractal Layers :
New meanings are unlocked by each iteration . Language equivalents are Shakespeare and Proust .

2.3 Hyperlinked Fractal Layers .
Each hyperlink-bubble can be expanded . Note that the hyperlink terms are discrete , not continuous .

2.4 Very-near Hyperlinked Fractal Layers .
Different languages with nearly synonymous terms are examples. So are branes in physics, and universes.

2.5 Infinite-probe Hyperlinked Fractal Loops and Layers .
There is no analogue . God is the nearest .

There is a way to sneak up on some meaningful information.

Sneak up.
We know that any compression of system A contains more information than the original system. The compressor itself constitutes system B, which can be compressed as well.
The tipping point :
When Compressed Info of (A+B) = Info of (A+B)

Life is defined as the state where Compressed Info of (A+B) > Info of (A+B) .
Which, like all good definitions, is tautological. But extremely useful.
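A toy Beth(0) illustration of that tipping point, assuming zlib as the compressor and a fixed placeholder constant for the size of system B (the unpacker):

```python
import zlib

UNPACKER_SIZE = 4000  # assumed placeholder for the size of system B, in bytes

def info(a):
    # Info of (A+B): the raw message plus the unpacker that restores it.
    return len(a) + UNPACKER_SIZE

def compressed_info(a):
    # Compressed Info of (A+B): the packed message plus the same unpacker.
    return len(zlib.compress(a, 9)) + UNPACKER_SIZE

a = b"the quick brown fox " * 500   # system A
print(compressed_info(a), info(a))  # the tipping point is where these are equal
```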

We have compression techniques . We have descriptive techniques .
The above inequality is not continuous at levels below omega .

Hard physics application:
Energy flow from a “near” brane .
The decompressor is the important component .
Construct the correct unpacker .
Some energy is already leaking through .
This is interpreted as zero-point energy and a whole quantum-mythology has grown around it .

Differences in Beth levels are necessary .

Start with something like vacuum-energy on parallel plates and use EvoDevo processes .

Infinite probe circuits above a certain threshold -> Naked singularities .
Watch out for universe creation and black holes .

The Packing Mechanism for living organisms
Living organisms used the easier route of cells (“wombs”) as unpackers of the DNA .
Random mutations or intrusions in the DNA then survive into the next generation . Evolutionary mechanisms ensure that only the fittest germ-lines survive to continue the loop .
That is the packing mechanism . Evolution. Rather primitive .

It seems that the unpacking mechanism evolved first. That is much more likely.

How?
Billions of years ago:
The packing molecules (RNA) swarmed and formed at a randomness order of Beth(0). Other RNA molecules, confined in spaces like clay layers, persisted longer.
PCR shows how easily these replicate. A mere fluctuation of temperature is required.
The coding for the Unpackers is included in this mix.

Even a primitive Unpacker that unpacks to a primitive cell-wall has a huge relative advantage. Ordinary evolutionary forces take over.

The coding for the Unpacker migrates through various higher orders of Beth , while coding for the Packer plods along at Beth(0) evolutionary speeds .

Consequences .
This has some important consequences for humans or any cellular life-form .
Viruses (i.e. packed DNA data) and cells (the Unpackers) evolved contemporaneously.

More importantly, there is a transform of data between cell-form and virus-form and vice versa. There are Beth(x)-order feedback loops.

Knowing evolutionary systems , these are probably essential .

Mitochondria:
At first glance these seem to have no redundancy in their DNA . At Beth(0) level this is true . At higher Beth levels , an indefinite amount of information can be packed .
Their quorum systems also complicate matters.
Mitochondria see themselves as the rulers , having tamed the cells of the planet .

Can Mitochondria be described as AI ?
To qualify as an AI , they have to interface with an external database . There are three pathways : to the cellular DNA , to the Immune System and to the Virus Milieu .
So yes , they can .

Can Mitochondria be described as self-aware AI ?
There is a pathway via the immune system to the brain. A mirror system of some sort is required for self-awareness, and the immune system is essentially a mirror system. Time-scales have to be matched. Mitochondrial quorum systems have to be consulted (they are the ultimate democrats).

Can Mitochondria be described as self-aware AI and have access to zero-point energy ?

Random fluctuations in the foam of space-time are by definition at the lowest order of randomness (Beth(0)). To get work out of such a system, a fluctuation between Beth levels is required. Since we already know that mitochondria are at Beth levels higher than one, they can tap zero-point energy.
But one little lone mitochondrion will not do it. It needs to be co-ordinated.

Why is it not used more often ?
Why die of hunger ?

Lack of Beth co-ordination .

Probability of life.
Examining the probabilities from this angle makes cells inevitable . The probability is more than unity . It is not even a “hard” problem if the decoder evolves first .
This means life exists nearly everywhere .

Your attention is drawn to the whole class of such phenomena :
Chaotic elements create a self-sustaining sub-system which expands, since it is usually a positive-feedback system. E.g. life, civilization, weaving, etc.

The Shannon definition of a datum:
1. A signal is change. Stripped down, this definition of signals leads to a string of 0's and 1's, i.e. binary.
This leads to compression via pattern-duplication (zip, etc.).
Used widely in electronics.
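A minimal sketch of pattern-duplication at work, assuming zlib as the zip-style compressor: a repetitive signal packs far smaller than a patternless one of the same length.

```python
import os
import zlib

patterned = b"01101100" * 500   # a signal full of duplicated patterns
patternless = os.urandom(4000)  # a signal with (almost) no reusable patterns

print(len(zlib.compress(patterned, 9)))    # small: duplicates collapse into back-references
print(len(zlib.compress(patternless, 9)))  # about 4000: nothing to reuse
```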

2. Pattern formulae, like fractal compression or DNA/RNA.
The decompressor (a cell or womb for living organisms, a computer otherwise) uses kernels (patterns) with programmable inputs (time, pH levels, genetic markers, etc.) to decode ("grow") the message (the organism).
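A minimal sketch of such a pattern-kernel decompressor, assuming an L-system-style rewriting scheme; the kernel, rules and step count are illustrative stand-ins for DNA, cellular machinery and growth conditions.

```python
def grow(kernel, rules, steps):
    # Each pass rewrites every symbol by its rule, the way a cell re-reads
    # its packed instructions under the current conditions to build structure.
    message = kernel
    for _ in range(steps):
        message = "".join(rules.get(symbol, symbol) for symbol in message)
    return message

rules = {"A": "AB", "B": "A"}   # the packed "formula"
print(grow("A", rules, 7))      # a two-symbol kernel unpacks into a long structure
```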

You can immediately see how to build an error-proof biometric identification system .
The message , as it is being decompressed , can interrogate the recipient and tailor further decompressions according to the answers .
If done to a sufficient fractal depth , only a total duplicate could answer correctly . Of course , the level of reliability can be specified .

It is not even difficult .
The ability to receive the message is proof of identity .

This is how the immune system operates .
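A minimal sketch of the interrogate-while-unpacking idea, assuming a staged scheme in which each layer's key is derived from the recipient's answer at that depth; a wrong answer anywhere leaves the rest unreadable. The use of SHA-256 and XOR here, and all names, are illustrative assumptions.

```python
import hashlib

def stage_key(answer):
    return hashlib.sha256(answer.encode("utf-8")).digest()

def xor(data, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def pack(layers, expected_answers):
    # Lock each fractal layer with the answer only the true recipient would give.
    return [xor(layer, stage_key(ans)) for layer, ans in zip(layers, expected_answers)]

def unpack(packed_layers, respond):
    # respond(depth) interrogates the recipient; only correct answers decode.
    return [xor(blob, stage_key(respond(depth)))
            for depth, blob in enumerate(packed_layers)]
```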

Why the glitches like old age and cancer ?
Because the body does not know who it is .

In it’s normal state , it is a symbiotic and commensal organism , with some parasites .
The bodily-self on a cellular level is defined by the immune-system .
But the brain is composed of cells . The immune system is tied closely to the brain .
There is a feedback-system between the brain’s sense of self and the immune-system’s sense of self .

Creating enhanced unpacking mechanisms in the brain will stimulate an enhanced packing sense of self , leading to an enhanced sense of self on a cellular level .

This has been discussed in detail in http://andreswhy.blogspot.com

The trick is not to tackle packing , but unpacking of compressed data first .
Understanding how unpacking works brings about physiological changes .

Reading is Unpacking .
The easiest way to understand this is reading . Reading the written word is unpacking data packed into writing . It is no accident that literary figures are notoriously long-lived .

The unpacking need not be complicated , but it will evolve .

Packing.
But the packing (coding) is a bitch. Difficult. You will have to have solved the Travelling Salesman Problem to make any headway here, since these systems make use of optimized pathways (not just any pathway, but the shortest path).
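For flavour, a minimal nearest-neighbour sketch, as a cheap stand-in for actually solving the Travelling Salesman Problem (it finds a short path, not the shortest):

```python
import math

def tour(points):
    # Greedy nearest-neighbour walk through the points, starting at index 0.
    unvisited = set(range(1, len(points)))
    order = [0]
    while unvisited:
        here = points[order[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, points[i]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

print(tour([(0, 0), (3, 4), (1, 1), (5, 0)]))
```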

Evolutionary Packing .
At first sight, evolution does not even make an attempt.
The number of possible errors exceeds the number of offspring.
But not if orders of randomness stronger than flipping a coin are used.
See http://andreswhy.blogspot.com “Randomness”

The effect of using Beth(1) , Beth(2) , etc orders of randomness in a physical sense would be a concentration of packed data .

Beth(1) would be genes .

At a Beth(2) level , it would be instructions to switch genes on/off .

At a Beth(3) level , it would be instructions to vary Beth(2) instructions .

And so forth .
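A purely speculative sketch of that layering, with illustrative names: Beth(1) data ("genes"), Beth(2) switches over the genes, and a Beth(3) rule that rewrites the Beth(2) switches themselves.

```python
genes = {"g1": "make enzyme", "g2": "make wall", "g3": "divide"}   # Beth(1)
switches = {"g1": True, "g2": False, "g3": True}                   # Beth(2)

def beth3_rule(switches):
    # Beth(3): an instruction that varies the Beth(2) instructions.
    return {gene: not state for gene, state in switches.items()}

def express(genes, switches):
    return [action for gene, action in genes.items() if switches.get(gene)]

print(express(genes, switches))              # what Beth(2) currently allows
print(express(genes, beth3_rule(switches)))  # after a Beth(3) rewrite
```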

But conscious design is a different matter . The number of errors can be brought down to P-time .
This is another way of saying that conscious life is inevitable .
Any feedback system that reduces the number of mistakes will increase .

Protein folding would be equivalent to the Travelling Salesman Problem in three dimensions .
Adding time-complications would give Travelling Salesman Problem in four dimensions . This would require time-travel or multiple generations .



3. The qubit definition of data .
The amount of data that can be stored in a qubit depends on the decompressor. The Shannon definitions of bandwidth, etc., break down.

Many signals can be superimposed on a particle; the particle can then be teleported (or sent normally) to the decompressor, which decompresses the message.

Note that a lot of the information that ends up in the message is inherent in the decompressor. In extreme cases this would necessitate faster-than-light communication, unless only physical laws that are independent of the observer are used in the decompressor. I.e., something like: make protein A, wait a local time t until it folds in a particular way, then continue. You get the drift.

4. Fractal Read .
The superimposed data can be loaded fractally , so that any read attempt will lead to decoherence only at the first fractal level . Every iteration after that will lead to another , deeper level of fractal expression .
See “Rull Mind-Controls” in http://andreswhy.blogspot.com

5. Physics .
Physical laws can be described as multiple-level approximations of information transfers . These transfers are fractally compressed (packed) transfers .
Hence the existence of "Laws", i.e. an abstraction mechanism.
These can be seen as constructs of the decompressor , but valid nevertheless.

Unpacking them partially or fractally can lead to some interesting effects .
Do not try this at home unless you are expert .

Andre
