epistemic extension into the outside world
sequence of speculative information-harvesting gambles
creating knowvelty


deleuzian assemblages -- i call 'em design patterns


determining knowledge about the past
and
determining knowledge about the future
are the same thing
but only one of them is experienced by us
as "causing" (the future)
the other as "discovering" (the past)
both of them constitute
  controlling the flow of information
  with respect to a local point of spacetime





Deleuze:

https://plato.stanford.edu/entries/deleuze/#DiffRepe


The Deleuzian concept of the virtual seems to refer only to the
virtual future.

Reterritorialization is counterfeit information about intentions with
respect to energy (i.e., reducible to our axiomatic foundations).


Differentials:
LOOP OR NO LOOP (chemical chain reaction event loop)
PERMEATE OR NO (cell membrane)
COPY IDENTITY (DNA or RNA replication; protein recognition by the thymus in primary tolerance; protein recognition by leukocytes either in secondary tolerance or as foreign bodies; recognition of )



By identifying the virtual as identical (if "counter-effectuated")
with a realization, Deleuze seems to be throwing out the principle of
difference??

Anyway nothing can realize an imagination; there can only be experience
that lacks surprise because of prior imagination.

(the remembered image and/or the brain structure programming left by
it e.g. structure in the salience network to trigger memory of the
imagination experience)

The prior imagination can be identified as the internal cause of the
suppressed emotional reaction of surprise.  The image should not be
_identified_ with the external event itself.





Sense and nonsense:  Communications are programs input to the remote
computer; i.e., they are proofs.

The program is "run" on the local computer brain, and the program
performs "load" operations based on emotionally-tagged memories, its
learned/acquired conceptual categories, etc., which allow meanings to
go across in very complex ways at times (e.g. in poetry) that aren't
any kind of closed set that could be defined by a grammar. A single
statement within a grammar can define a new more complicated grammar
for all subsequent statements (indeed, a communication atom necessarily
always defines a new more complicated double-encoding for all subsequent
communications, depending on whether it was received). You can layer on
more and more meanings because of the generality of a computer brain
that is allowed to transmit programs to other computer brains.
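
A minimal sketch of that last point -- a statement stream in which one
statement extends the grammar through which every later statement is
read (the toy "def" syntax is my own illustration, not anything from
the source material):

    # A toy interpreter whose rule set is extended by its own input.
    # "def NAME = EXPANSION" adds a rewrite rule; every subsequent
    # statement is read through all rules defined so far.
    def run(statements):
        rules = {}
        for s in statements:
            if s.startswith("def "):
                name, expansion = s[4:].split(" = ", 1)
                rules[name] = expansion          # the grammar just grew
            else:
                for name, expansion in rules.items():
                    s = s.replace(name, expansion)
                print(s)

    run([
        "def rhizome = network with no privileged root",
        "a rhizome resists hierarchy",
        "def hierarchy = tree-shaped control structure",
        "a rhizome resists hierarchy",           # same atom, richer decoding
    ])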

A property of proofs (perhaps important to the evolution of language)
is that validating proofs is separate from, and computationally cheaper
than, generating proofs.


When proofs cannot be validated because they contain references that
cannot be resolved (undefined terms, i.e. terms with no definition in
the locally-available primitives such as axioms), the validator produces
an error response that can be called recognition of denotational
nonsense.
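
A minimal sketch of both points -- validation as a cheap linear scan,
with unresolved references raising the "denotational nonsense" error
(the miniature derivation format is invented for illustration):

    # Validating a derivation is O(n); generating one would be a search.
    AXIOMS = {"A", "A->B", "B->C"}

    def validate(derivation):
        known = set()
        for kind, formula, *rest in derivation:
            if kind == "axiom":
                if formula not in AXIOMS:
                    # a reference that cannot be resolved locally
                    raise NameError(f"denotational nonsense: {formula!r}")
            elif kind == "mp":                   # modus ponens from p, p->q
                p = rest[0]
                if p not in known or p + "->" + formula not in known:
                    return False                 # invalid, but meaningful
            known.add(formula)
        return True

    print(validate([("axiom", "A"), ("axiom", "A->B"), ("mp", "B", "A"),
                    ("axiom", "B->C"), ("mp", "C", "B")]))   # True
    # validate([("axiom", "Z")]) -> NameError: denotational nonsense: 'Z'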








For Deleuze, the task of art is to produce “signs” that will push us out of our habits of perception into the conditions of creation. When we perceive via the re-cognition of the properties of substances, we see with a stale eye pre-loaded with clichés; we order the world in what Deleuze calls “representation.” In this regard, Deleuze cites Francis Bacon: we’re after an artwork that produces an effect on the nervous system, not on the brain.


Instead of art, what we need is SILENCE to enable COMPRESSION AND
RE-PROCESSING -- but isn't that what Deleuze said?  CIRCUIT BREAKERS SO
WE CAN EVADE CONTROL.






A university is an Erlang-style message-passing system by which the
cultural life-system of academic knowledge accumulation regenerates
itself.



The key point for Deleuze is that the "counter-effectuation" is
actually real-life, really-physical MaxEnt physics rather than
quantum-physics analogy/woo: Bayesian statistical knowledge deriving
from information theory.


Deleuze didn't understand quantum physics correctly but it turns out
that it doesn't matter because quantum physics doesn't have anything to
do with metaphysics.  It's only that Uncertainty forces human beings
to adopt a de-centralizing de-totalizing Copernican mental shift.  But
it doesn't even do it in the way that is most relevant to metaphysics.
There is also the de-centralizing de-totalizing Copernican mental
shift of INTUITIONIST MATHEMATICS.

Back to physics: Deleuze understood the main point, that particles
are merely virtual constructs while "interaction events" are the
actual reality available to advanced physics.  The particles are
virtual constructs that exist only in the human 3D mental model, which
is definitely NOT the same as the physical universe -- this is one of
those places where we see the difference.  But the physical universe,
in making individual particles, makes places where information access
is fundamentally limited, because the boundary between one particle and
another with which it interacts isn't so much illusory as the only real
thing, while the non-boundary is illusory.

Quantum physics DOES imply a macro universe where macro assemblies
of particles also have limited access to information; but the actual
universe we see has EVEN MORE limitations on access to information,
they are much much stricter than Uncertainty, and therefore we see much
less information embedded in physical objects than Uncertainty allows
in its theoretical maximum.  (Physics experiments can be set up so
that information is not lost; but life in general is always balancing
loss of information against energy expenditure.)  MaxEnt physics and
Bayesian statistics are mathematical/physical approaches to calculating
the information available at a given spacetime location.  However, part
of the nature of quantum uncertainty AND MaxEnt physics is that, from
WITHIN the system, the limitations apply to the observer, and the limits
are self-referential in the sense that an observer's inability to have
information from other spacetime points can include the inability to
know which information is available!  I.e., the theory produces known
unknowns.  The fact that there are spacetime points in the universe
where knowledge of mathematics does not exist, or exists at a merely
undergraduate level, also means there are unknown unknowns.





Deleuzian metaphysics
attempts to describe
construction of a neural network
from the inside

Mr Deleuze talks about phase state changes
And social collisions as quantum events
But never describes the individual's phase
state as subject to change as a result
of internal computation resulting in lossless
or lossy compression of the neural structure;
nor does he talk about the salience network.

Deleuze is "wrong" about the quantum particle information but
correct about the historical information.  The thing about "quantum
woo" is that quantum uncertainty is the only familiar model of the
physical/theoretical limitations of knowledge, and it is used to
illustrate other limitations of knowledge when the nature of
the limitation is not even related.

Heisenberg's Quantum Uncertainty is only one of many physical
limitations on what knowledge is available in the universe (and where,
when, etc).  (It may be the most counterintuitive, since it implies that
spacetime is fundamentally not like our human, vision-based mental model
of it).

Replacing Heisenberg with MaxEnt fixes a lot of philosophical or
non-scientific misuse and mentally clarifies the nature of information
flow through the universe.  Also, MaxEnt converges to Bayesian
statistical reasoning and there seems to be some kind of convergence
with ethical ideas there.
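
A worked toy version of that calculation -- Jaynes's dice problem done
with MaxEnt (the constraint value 4.5 is arbitrary; this illustrates the
method, not anything Deleuze wrote):

    # Maximum entropy over die faces 1..6 under one constraint: E[x] = 4.5.
    # The MaxEnt solution is exponential-family, p(x) ~ exp(lam * x);
    # solve for lam by bisection on the implied mean.
    import math

    FACES = range(1, 7)
    TARGET_MEAN = 4.5

    def mean_for(lam):
        w = [math.exp(lam * x) for x in FACES]
        return sum(x * wx for x, wx in zip(FACES, w)) / sum(w)

    lo, hi = -10.0, 10.0
    for _ in range(100):                 # mean_for() is increasing in lam
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mean_for(mid) < TARGET_MEAN else (lo, mid)

    w = [math.exp(lo * x) for x in FACES]
    print([round(wx / sum(w), 4) for wx in w])

The result is the least-committal distribution consistent with the
stated information and nothing more -- "the information available at a
given location" made concrete.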





Tue Oct 31 10:09:22 AM EDT 2023

Either a structure exists in a brain or it does not.

A structure in the brain that does exist can be equivalent, up to
isomorphism, to many structures outside the brain.

The ability of the individual to recognize a pattern relies on both
the pattern existing (brain structure subject to transformations under
morphism) and the brain's secondary (e.g., salience network) structures
correlating the structure with some perception or basis of comparison.

Deleuze talks about difference as if a person could compare a previous
experience to a current experience; this is only a subjective illusion.
The previous experience always alters the network through which the
future experience flows, but the previous network configuration becomes
permanently unavailable ("past") and disconnected in every subsequent
flow.  The illusion occurs when the individual has already experienced
both events, and then experiences remembering them by comparing two
memories.  These two memories will surely be stored in the structure
using some redundancy.  However, the brain cannot literally compare the
structure before to the structure after; this is an illusion. The brain
constructs a new memory of the before, incorporating information that
occurred later, as it erases the old memory of the before.  This is
a potentially lossy compression mechanism, but it also allows the
brain to employ a processing strategy that is idempotent with respect
to the ordering of the life events from which it constructs a life
strategy adapted to the immediate environment.  Humans are _adapted
to adapt_ to novel environments, not only individually but socially.
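
A minimal sketch of what "idempotent with respect to ordering" can mean
computationally: a merge that is commutative, associative, and
idempotent, so replaying events in any order (or more than once) yields
the same stored structure.  The event model is mine, purely
illustrative:

    # Each "memory" keeps, per feature, the strongest impression so far.
    # Max-merge is commutative, associative, and idempotent, so the final
    # structure is independent of event ordering and of replays.
    from functools import reduce
    from itertools import permutations

    def merge(memory, event):
        out = dict(memory)
        for feature, strength in event.items():
            out[feature] = max(out.get(feature, 0), strength)
        return out

    events = [{"danger": 3}, {"food": 5, "danger": 1}, {"shelter": 2}]

    outcomes = {tuple(sorted(reduce(merge, order, {}).items()))
                for order in permutations(events)}
    print(outcomes)          # exactly one outcome: ordering never mattered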





Tue Oct 31 11:16:09 AM EDT 2023

The process by which computer systems socially evolved into
internet-based specially-centralized distributed computation should
serve as a model for understanding other evolutionary transitions toward
distributed computation such as the evolution of sociality in humans and
of the immune system and its mechanisms of tolerance and adaptation.

The immune response can be seen as normalizing with respect to the
binding energy of the antibody-producing leukocytes. There is a
biological mechanism to supply food energy differentially according to
binding strength. There is also a biological mechanism to control the
rate of mutation (i.e., of originality) within these leukocytes as they
produce mutated child leukocytes; this corresponds to academics who
read Deleuze and then try to use Difference and Repetition to encourage
creativity in children's art, etc.  Did Deleuze discuss the controlled
introduction of mutation (originality, Chomskyian generators) into
normalizing systems?
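
A minimal sketch of the normalizing loop just described -- reproduction
in proportion to binding strength, with a controlled mutation rate as
the source of originality (the fitness function and constants are
illustrative, not biology):

    # Affinity maturation as a toy evolutionary loop.
    import random

    TARGET = 0.73                    # the antigen's "shape"
    MUTATION_RATE = 0.1              # the controlled dose of originality

    def binding(cell):
        return max(0.0, 1.0 - abs(cell - TARGET))

    population = [random.random() for _ in range(50)]
    for generation in range(30):
        weights = [binding(c) for c in population]
        parents = random.choices(population, weights=weights, k=50)
        population = [p + random.gauss(0, MUTATION_RATE) for p in parents]

    print(round(max(binding(c) for c in population), 3))  # approaches 1.0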

The actual generators must be "compressed" structures on which
computation is performed without "decompression."  Feeding noise
into compressed structures and then decompressing them results in
the generation of random but structured information.  The results
are filtered in the frontal lobe in a way that is analogous to the
filtering function of the thymus in producing primary tolerance.
Socially, language filtering may be primarily a mechanism to prevent
linguistic self-destruction; people with certain brain conditions
reveal a socially-unfiltered generator, while others reveal a generator
unfiltered even by connection to reality.  I posit that there is some
kind of dream or unconscious process that filters language for
self-destruction, and that this can block the compression of the
structure, because the mechanisms of compression (and integration with
the rest of the brain) produce emotional responses that then stop the
process; the integration process simply crashes because of an emotional
overload.  Certain thoughts cannot occur, and therefore there is no
possibility of sending or receiving communications about them; all such
communications must be coded, implied, or produced implicitly by
unrelated structures, and the more accurate or precise these become,
the closer the mind's pattern recognition will come to finding and
executing the brain structure that causes the crash.  I believe this
to be an evolved mechanism and part of the social computation machine;
the surrounding society can make thoughts unthinkable using the
hormonal/emotional voice/facial/gestural signalling system that allows
multiple brains to be integrated into a single distributed computation.
For this reason, it is not directly analogous to the immune system mechanism
of primary tolerance as described above.
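
A minimal sketch of "feeding noise into a compressed structure": a
Markov chain stands in for the compressed structure, random draws are
the noise, and a rejection test stands in for the filter.  The corpus
and the filter rule are toy stand-ins, not a model of the brain:

    # The chain is a lossy compression of the corpus; sampling from it
    # "decompresses" noise into random but structured output.
    import random
    from collections import defaultdict

    corpus = "the virtual is real the actual is real the virtual differs".split()

    chain = defaultdict(list)                 # the compressed structure
    for a, b in zip(corpus, corpus[1:]):
        chain[a].append(b)

    def generate(n=6):
        word = random.choice(corpus)          # the injected noise
        out = [word]
        for _ in range(n - 1):
            word = random.choice(chain[word] or corpus)
            out.append(word)
        return " ".join(out)

    def acceptable(text):                     # the thymus-like filter
        return "differs" not in text          # toy rejection rule

    print([c for c in (generate() for _ in range(20)) if acceptable(c)])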

However, the immune system's mechanism of primary tolerance actually
_is_ a distributed system which incorporates its own behavioral control
loop connection to the local brain -- specifically the sense of smell,
which is used to assess histocompatibility of mates -- the human social
distributed hormonal network computation uses pheremonal sampling of
individual humans in order to produce social barriers that prevent
disease spread.  _This_ is the biological mechanism analogous to the
social filter of generated creativity.  The system is based more on
controlling the inputs to the generator than the generation.

Oregon.  There's more again.

In the sense of smell example we also have attraction and repulsion
as basic forces.  This occurs in the brain filter as well; there are
multiple emotional reactions to every generated possibility, and the
brain will use an emotional gestalt to choose when and whether to
activate some possibility.  When the reaction is a total absence of
positive emotion, the generated content will be discarded.  Social
systems similarly have multiple "buckets" in which to put each person;
not only spaces (such that a person can only have one) -- the individual
person can be "pigeonholed" multiple times, adopting multiple roles
-- but the buckets themselves as social constructs -- are they not
equivalent to the emotions as biological constructs?

Did Deleuze put this in there or what?


Tue Oct 31 11:59:50 AM EDT 2023
CENTRAL LIE
The central lie of narrative fiction is the conclusory ending.  Why
is there a conclusory ending?  Because the computational process of
computing the narrative must end (or else continue).  When it finishes,
the audience feels the task completion in reality but projects it into
the imaginary of the story. This creates the danger of such projection
onto the individual's own life; either in total, or in its various
compartmentalized elements (e.g., a relationship, a social event).

Task completion is a frontal lobe event.  The frontal lobe recognizes
the generation of a stop code of some kind (analogous to the stop codons
of DNA but also analogous to a process exiting according to its internal
logic, rather than being terminated by an exogenous signal).

The computational process that is open, a continuous non-terminating
generator, unless very specially selected, is almost sure to be boring.
A closed (terminating) generator is interesting in proportion to its
length.  Life itself is a closed (terminating) generator of chemical
chain reactions, and human beings are one of its longer chain reactions.
Studying this chain reaction is biology and is interesting because the
subject is finite (permitting the conceptualization of task completion
as a future event thus flowing energy into present events predicted to
make the desired event more likely later).

Human culture is an open (non-terminating) generator and is filled
with interesting things only because human beings undergo extensive
search operations to gather information about their local environments,
collecting and correlating that information in their internal brain
structures, compulsively sharing it with others as part of the design
of the distributed computation.  Compulsive sharing creates a kind
of cytoplasm of information for all humans to filter, selectively
reflect, and otherwise use in combinatorial ways.  Because of the
filtering and selectivity, and the previous energy of collection and
computational compression, the information produced and shared by
human beings is vastly more interesting than open generators selected
at random.  Human culture is the longest-known chemical reaction
loop.  Human culture is the only chemical reaction not known to loop or
terminate.  Human culture is the only true "irrational number" of all
discretely-instantiated numbers.







Tue Oct 31 01:23:16 PM EDT 2023

Feynman and practicing with a different box of tools

Same idea as the Max Ent explanation of prophecy

But also the same idea as parable of the falling seeds, reversed in
time; the seeds unfall to the sower, and depending on seed origin
(fertile soil, or barren) the sower becomes either someone who can farm
or someone who knows what it means to be unable to farm.  The knowledge
passes from the earth through the seed into the farmer; the seeds
provide the connection.  The disabled would-be-farmer is disconnected
from that knowledge even though he too has and sows seeds.  His seeds,
though sown, fail to connect out to knowledge from the past and he may
therefore fail to connect himself out to intentions from the future (or
else not even form them).



The 20th century was spent correlating the implications of a physical
limit of the speed of light.

The 21st century will be spent correlating the implications of
the physical limits of the speed and size of computations.

The human being as a computer system undergoing phase changes as the
computer gains the ability to represent different types of state -- or
to represent state with different performance characteristics -- through
acquisition of data structures copied from the environment -- OR from
internal processing and DISCOVERY of NEW data structures.

These data structures are PASSED BETWEEN HUMANS who learn them
implicitly and pick them up and play with them.  But data structures
are unsafe when EXECUTED AS REASON and for this reason human beings
have SYSTEMS OF ACCESS CONTROL to HUMAN REASON both internal to their
minds (e.g., concepts of valid and invalid authorities) and external as
social environment.  Society imposes economic exploitation which causes
evolutionary adaptations to "bubble up" in ways that are UNPREDICTABLE
IN DETAIL (chaos theory) but according to evolutionary theory will tend
to produce EFFICIENT DISTRIBUTED COMPUTATION so that it will converge
to the computer systems we find most advanced as well as the biological
systems of generating and filtering novelty that we find most advanced
(except that the search space may have valleys etc).

Another system of access control is RUNNING IN EMULATION: this is when
the individual learns enough about a foreign system to execute the steps
of its reasoning without however being allowed to reach any conclusions
that apply to the larger brain's data structures.  There are two reasons
why humans cannot rely on this mechanism primarily.

First, EMULATION CAN BE JAILBROKEN; this can never be entirely secure.

Second, more importantly, RUNNING IN EMULATION IS COMPUTATIONALLY MORE
EXPENSIVE.  Even though CPUs and apparently also human beings have mechanisms
to optimize emulation, in human beings especially, these cannot obtain
"native" performance.  Therefore, computational emulators (e.g.,
learners of a second language) cannot "actually" perform as well as
computational originators (e.g., learners of a first language) if they
use the same underlying computational equipment for the same amount of
time.
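
A minimal sketch of the cost gap -- the same computation run natively
and through a tiny interpreter that pays a dispatch cost on every step
(the bytecode format is invented for illustration):

    # Summing 0..n-1 natively vs. in "emulation."
    import timeit

    def native(n):
        total = 0
        for i in range(n):
            total += i
        return total

    def emulated(n):
        # program for a toy machine: acc += i; i += 1; loop while i < n
        program = [("add_i",), ("inc_i",), ("jump_if_lt", 0)]
        acc = i = pc = 0
        while pc < len(program):
            op = program[pc]
            if op[0] == "add_i":
                acc += i
            elif op[0] == "inc_i":
                i += 1
            elif op[0] == "jump_if_lt" and i < n:
                pc = op[1]
                continue
            pc += 1
        return acc

    assert native(10_000) == emulated(10_000)
    print(timeit.timeit(lambda: native(10_000), number=100))
    print(timeit.timeit(lambda: emulated(10_000), number=100))   # slower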

But human beings do not all have the same underlying computational
equipment; and they do not all apply the same amount of time to
processing it.  In the real world, running the other side in emulation
is something that more intelligent, more informed, or more adult human
beings attempt to do when interacting with less intelligent, informed,
or adult ones.  Human beings may also believe they are running the other
side in emulation, when they are running a gross simplification; in
fact, they are running a gross simplification even when they run the
remote side natively, since they always still have to emulate the entire
remote environment(!) which is where the real problems start.

Non-portability of language between individuals is a major problem.
Before the internet, locality constraints on communications caused
portability to self-organize locally; but the internet has changed
communication patterns so that every person experiences a kind of
cosmopolis without totality.  Every experience is a scene from a virtual
city which is a construct only of that experience; each event and
corresponding city co-singular; co-existing only once without object
permanence.

One problem is that the human tendency to imagination, roleplay, etc.,
causes human beings to pretend communication incompatibilities are
not real.  Human beings must surely have evolved under circumstances
where perceived universality of linguistic forms was vastly more
common than it is today in the adult internet-connected world, though
perhaps less common than it is today in the world of the schoolchild
or university student or professor.

The professors may not make the same naive/incorrect excuses as children
for failing to communicate; their perspectives will be more realistic;
the university system as a whole is constrained in certain ways to
succeed in transmitting information; but insofar as these transmissions
fail, are the reasons understood from a rational information-theoretic
perspective?  Or is it a primate emotion static control program designed
to regulate subordinate behavior emotionally, amplifying the causal
force of the intentions of individuals positioned in social hierarchies
such that their anger generates fear in others?  Or is it a whole series
of task-activated network programs, each one separately influenced
by its own emotional context?  Perhaps they are constrained by
environmental demands to understand these failures operationally.


The task-activated networks seem to be the neurological place of
mental compartmentalization; and the ADHD don't shut off the DMN when
activating TANs.  We still "see" the task when others are absorbed
"in" the task.  Of course, in order to influence the DMN, it would
have to be activated.  The TANs feed back into the DMN in ADHD, which
allows the ADHD brain to generate totalizing connectivities by putting
information from disparate parts of universe into the same local
computational system; where for the non-ADHD these same components,
though contained within one BRAIN, are not connected into the same
integrated computational system; the TANs are prevented from feeding
back into the DMN, which allows mental compartmentalization to prevent
information from one controlled system from producing interference in
another controlled system when each controlled system is controlling the
same physical human being with a different control algorithm.

In other words, the DMN or the big picture understanding does not
help with, but interferes with, TAN activity downstream of power
in the social grid, because of the way in which this activity is
structured to depend on human beings as removable components,
keeping the environment highly-controlled.  General intelligence is
not useful in highly-controlled environments until they begin to
break down.  High-efficiency local computation requires discarding
global information in order to maximize local connectivity of the
processed information and thus processing speed.  (Principle of
cache locality.)  So as optimization proceeds, the big picture is
squeezed out of every local environment; except SOME privileged local
environment has to be preserved in order to manage the organism's
interaction with _environments_ themselves; this is the executive.
The organism has a consciousness of multiple discrete environments;
each environment controlled by some local control system; each local
control system incorporating its own different model of human
emotion and behavior as necessary to sustain its specific local
constraints.
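
A minimal sketch of the cache-locality principle invoked above -- the
same sum, traversed with the layout of memory and against it.  In
CPython the effect is modest but measurable; in C or numpy it is
dramatic:

    # Row-major vs. column-major traversal of the same nested lists.
    import timeit

    N = 1000
    grid = [[1] * N for _ in range(N)]

    def row_major():                 # inner loop follows storage order
        return sum(grid[r][c] for r in range(N) for c in range(N))

    def col_major():                 # inner loop hops across rows
        return sum(grid[r][c] for c in range(N) for r in range(N))

    print(timeit.timeit(row_major, number=10))
    print(timeit.timeit(col_major, number=10))   # slower: locality discarded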

Emotions are the foundational social control levers in humans.  Not
life/reproduction directly, as it would be in the case of domestic
plants; but emotion/physical-reproduction-of-imaginary-will plays
the same structural role, allowing animalia the meta-evolutionary
advantage of evolving without biological death; emotional sampling with
differential reproduction of imaginations replaces eukaryotic sampling
with differential reproduction of offspring in the information-gathering
social super-organisms of mammalia.

In a school, a student convincing their teacher that they do not belong
in the space to which they are assigned is NOT sufficient to liberate
the student from the space; only a non-local authority assigning
them to some other space can liberate the student from the local
space.  The student having the level of understanding of the system
that would cause them to reach this conclusion correctly tends to make
the student even less able to perform in a space where they do not
belong; if the student instead internalizes a false simplified local
model in which mis-spacialization is impossible by
construction, then the student may have a better chance of passing
through the filters imposed by the environment for reaching a more
appropriate spacialization.  If the student internalizes a more
realistic, more complete, but externally-referencing (non-local)
model, then compatibility issues are likely in communication with
their teacher; if compatibility exists between the teacher and the
student, then the compatibility issue will exist between the teacher and
administration; or else the administration will have issues with the
school board; or the electoral system; or else the local municipality
itself will drain tax funding since diaspora from other schools will
collect locally.  At every possible avenue where the "exception" could
"bubble up", there will be an incompatible interface, because the
system attempts to impose a constraint that exceptions are handled
non-locally.  All biological systems impose this constraint because of
how it produces a superorganism that is more intelligent and robust than
if its individual components were individually intelligent and robust.
Advanced decentralized computing systems also impose this constraint; it
is a foundational principle of Erlang.
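
A minimal sketch of that Erlang principle -- the worker contains no
error handling at all; failure escapes to a non-local supervisor, which
applies policy and restarts (a Python stand-in for the Erlang pattern,
not Erlang itself):

    # "Let it crash": exceptions are handled non-locally.
    def worker(task):
        if task < 0:
            raise ValueError(f"cannot handle {task}")   # no local recovery
        return task * 2

    def supervisor(tasks, max_restarts=3):
        results = []
        for task in tasks:
            for attempt in range(max_restarts):
                try:
                    results.append(worker(task))
                    break
                except ValueError:
                    task = abs(task)    # non-local policy: reassign, retry
        return results

    print(supervisor([1, -2, 3]))       # [2, 4, 6]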

Another principle important probably is that in order to learn a lot
of things you ought to be independently generating them yourself;
the fact that someone has generated something and transmitted it to
someone else does not mean that they transmitted the generator;
transmitting the generator between people may have more to do with
copying the environment in which the independent generation occurred;
mathematics provides students an environment in which to independently
re-discover the fundamental theorems; but mathematical education outside
of universities does not seem to understand this principle even in
schools that feed top universities.  Students are fed the theorems to
memorize and use without even being fed the raw material from which
the theorems were originally derived.  Thus they are optimizing to
demonstrate a false affectation of mathematical education.  Gresham's
Law again.  Erlang illustrates the structure of passing the generator as
well as the data.
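
A minimal sketch of the distinction -- transmitting the data versus
transmitting the generator.  Here the "generator" is just a Python
function being handed over; in Erlang terms, sending the fun rather
than its output:

    # Transmitting data gives the receiver a fixed list of results.
    # Transmitting the generator lets the receiver derive new ones.
    def make_generator(axiom):
        def derive(n):
            return [f"{axiom} + case {i}" for i in range(n)]
        return derive

    teacher = make_generator("axiom-A")

    data_transmission = teacher(3)        # memorized theorems: replay only
    generator_transmission = teacher      # the environment to re-derive in

    print(data_transmission)
    print(generator_transmission(5))      # results beyond the original data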


Tue Oct 31 01:59:34 PM EDT 2023

Rappers are only really good at styling up content that they copy from
other places.  They generate novelty only in style, they do not generate
novel content.  Novel content is generated in places other than hiphop and
then incorporated there.  People who are competing in social spaces
for the best content do not put that content in hiphop style.  People
competing in social spaces with hiphop style are not competing on
content and do not bring dense content into the competition.