Life is like riding a bicycle. To keep your balance, you must keep moving.
- Albert Einstein
The truth is the best adventure.
- Jordan Peterson
Atheists are right. Just not fully right.
- Unknown
Introduction
The sub-title of this blog is "Ontological Mathematics for Engineers". Usually, I don't emphasize the engineering side of this equation too much, since I assume most of my readers are non-engineers interested in Ontological Math (OM). However, in this post we're going to get a little technical, so it's probably going to mostly appeal to people with an engineering/technical background. Per usual however, if you're not technical, hang in there -- I think you'll still find some useful ideas.
The only thing in this article that might scare off some people is the concept of a y-combinator, which comes from fundamental computer science. But we will only deal with the essence of what it represents, and I promise, you won't have to know all the technical details to absorb what I will be saying.
In this post, we will explore what a y-combinator is, and how various systems, including OM, make use of y-combinator-like properties. We will then discuss the limitations of what a y-combinator can do, via a concept I call the q-combinator. With the terminology thus established, we can then compare sleight of hand with magic, and explore their respective usage in OM itself.
Lambda Calculus and Turing Machines
The basic difference between theoretical Computer Science (CS) and practical CS is the languages used. Practical CS is based on languages like Java, Python, Lisp, Assembly, etc., while theoretical CS uses mathematical languages/systems like lambda calculus and Turing machines. In short, theoretical CS is mostly about mathematical proofs, and has very little to do with writing code that actually does anything "useful".
The two big computing models used in computer science are Turing machines and lambda calculus. Turing machines were proposed by Alan Turing in 1936, and lambda calculus by Alonzo Church around the same time. At first considered different models, paralleling a similar split in quantum mechanics a decade earlier between Schrodinger's wave equation and Heisenberg's matrix equations, they were later shown to be, just like their quantum counterparts, equivalent. Turing machines approach things from a mechanical point of view, that is to say they model the hardware, while lambda calculus models more of the software side of things. Neither is really meant to be "practical"; they are simply theoretical models of "real" computers and software. It's interesting to note that Turing, who is commonly believed to have been autistic (at least mildly, with Asperger's syndrome), came up with the more mechanical model, while Church, who was not autistic (but was arguably its opposite: artistic), came up with a more traditional mathematical formulation.
Wikipedia says:
Lambda calculus (also written as λ-calculus) is a formal system in mathematical logic for expressing computation based on function abstraction and application using variable binding and substitution. It is a universal model of computation that can be used to simulate any Turing machine.
The lambda calculus is basically the math of functions and variables.
Y-Combinators
One of the big restrictions with lambda calculus is that it cannot directly handle recursion. Recursion is simply the ability of a function to call itself. While that may not seem like a big deal, consider that in general, any algorithm you are trying to model could be recursive, and also that in pure functional programming (FP), of which the lambda calculus is a member, recursion is used as a way to iterate (loop) over expressions. The reason why looping non-recursively (i.e. using a for/next loop) is "bad" in FP is a little complicated, but has to do with state and side-effects. Suffice it to say that state and side-effects are hard to model when using a system based on pure functions. However, by using recursion you are able to avoid these problems.
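To make the loop-versus-recursion contrast concrete, here is a minimal Python sketch (my own illustration, not part of the lambda calculus itself): the imperative version mutates an accumulator by side effect, while the recursive version threads all state through the calls themselves.

```python
def total_loop(xs):
    # imperative style: a mutable accumulator updated by side effect
    acc = 0
    for x in xs:
        acc += x
    return acc

def total_rec(xs):
    # functional style: no mutation, state flows through the call itself
    return 0 if not xs else xs[0] + total_rec(xs[1:])

print(total_loop([1, 2, 3, 4]))  # 10
print(total_rec([1, 2, 3, 4]))   # 10
```

Both compute the same sum, but only the second could be expressed in a system of pure functions.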
In short, not having recursion in a CS calculus is a big restraint, because it greatly limits the processes you are able to model. The advantage of not having direct support for recursion, however, is that it greatly simplifies the model and makes proofs easier. So you have to resolve the tension between having an optimally simple language that does not allow for recursion, and an optimally powerful language that does support recursion. Fortunately, in the case of lambda calculus, the y-combinator can be used to emulate recursion (i.e. one step beyond merely simulating it), thus indirectly giving you the power of recursion while at the same time maintaining the simplicity of a natively non-recursive language.
Let me state that again: y-combinators allow recursion to "emerge" from a non-recursive system. This is like having consciousness emerge from non-conscious matter. It's a big deal. This may sound miraculous, but the actual mechanics of the y-combinator are pretty basic.
An important point to note about the y-combinator is that it does not add any *new* functionality at all -- it merely "extracts" and exploits existing behavior that is already built into the lambda calculus. The y-combinator operates within the parameters and capabilities of the lambda calculus itself. The "output" of the lambda calculus being "transformed" by the y-combinator is simply the lambda calculus itself, i.e. the same as the input, but perhaps with certain latent properties explicated (metaphorically speaking).
When the y-combinator is taught to CS students, it's either presented as a "miracle operator", or as just a rather mundane extraction of existing capability. But which one is it?
The answer, I suppose, is that it is somehow both. Logically, it can be considered magical since it gives you such great power. But physically it's rather mundane in that it is simply exploiting an implicit property of lambda functions and making it explicit.
Here's a diagram of a function recursing (e.g. calling) itself five times.
Diagram 1

The important thing to note in this diagram is that there is always just one copy of the function, no matter how many times it calls itself.
Here is a diagram of a function calling other functions: function f, calling function g, calling function h etc.:
Diagram 2

Here is a diagram of how the y-combinator emulates recursion:
Diagram 3

What's going on here is that we are basically creating five copies of the original function and then having each copy call the next via standard function calls, as shown in diagram 2. There is an obvious disadvantage to doing it this way: if the recursion stack is deep, say a million recursions, you would literally have to have a million copies of the function in memory, instead of just one in a system that directly supports recursion. But since lambda calculus is never used for practical work, this isn't a real problem from the perspective of a theoretical model.
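The copy-based scheme of diagram 3 can be sketched in Python (hypothetical names, my own illustration): each "copy" is an ordinary function that calls the next copy, just like f calling g calling h, yet the overall behavior is a recursive factorial -- with no function ever calling itself.

```python
def make_copy(next_fn):
    # each copy calls the *next* copy in the chain, never itself
    def copy(n):
        return 1 if n == 0 else n * next_fn(n - 1)
    return copy

f = lambda n: 1        # base copy that ends the chain
for _ in range(5):     # build five chained copies
    f = make_copy(f)

print(f(5))  # 120, i.e. factorial(5), emulated via plain function calls
```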
I use the word "emulate" here to mean something that is not quite the same as the original, but about as close to it as you can possibly get. This is in contrast to a simulation which is two steps removed from the original. For instance "3.14" is a simulation of pi, but "3.1415926.." (down to the maximum bit level of the calculation) would be an emulation -- not quite the same as "pi" (the concept) but another level of accuracy beyond a simulation, and for all intents and purposes identical (e.g. isomorphic) to the original.
So the y-combinator kind of exploits an existing property of the system: that calling a copy of yourself will yield the same results as recursing on yourself. In effect, it's making an implicate property of the system explicate. The reason it's called a combinator is simply that in the lambda calculus the only way you can get new functionality is by combining functions, so the y-combinator is a function that wraps another function, i.e. combines functions. When we apply this basic intuition of the y-combinator as an "implicate to explicate" promoter to other systems, we could very well call it the y-operator, or the y-transform, but I like the word combinator because it sounds kind of sexy, mysterious, and cool.
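For the curious, here is the combinator itself in Python. Strictly speaking this is the Z combinator, the variant of the Y combinator that works under eager evaluation (the textbook Y combinator loops forever in a strict language like Python); the helper names are my own.

```python
# The Z combinator: no function below refers to itself by name,
# yet recursion emerges from self-application.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# A "factorial step" that receives the function to use for the
# recursive case as a parameter, instead of naming itself:
fact_step = lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1)

factorial = Z(fact_step)
print(factorial(5))  # 120
```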
It's also interesting to note why it's called the "y" combinator. I can't say this for sure, but if you think about it, "y" is an upside down version of a lower case lambda character. That is to say the y-combinator extracts the opposite or the inverse dual (be it the co-, anti- or un- flavor of inverse dual) of the underlying base (or nomos). In this case I would say it extracts a transcendent inverse dual, thus we can consider it to be an "un-extractor". See further this post about controlled paradoxes that talks about co-, anti-, and un- versions of inverse duals.
Note, if you google around on y-combinator, you will get a lot of very non-intuitive technical discussions about what it is. You'll also see plenty of examples of how you can derive the y-combinator and thus "prove" it. In this post, I present a highly intuitive understanding of the essence of the y-combinator. It took me several attempts at trying to "get" the y-combinator before I finally figured out this intuition. There's an awful lot of mumbo-jumbo out there that doesn't enlighten at all as to what I consider to be the true understanding of this important operator, so be forewarned.
Sleight of Hand vs Magic
Ok, so now we're done with the technical part of this post. At this point we can start to bring it all back to OM.
We can now take the essence of what a y-combinator is and apply it to non-CS systems. As I said previously, the essence of the y-combinator is that it makes the implicate explicate. It doesn't create any new properties, just highlights and possibly amplifies existing ones. This is typically done in a very mundane and obvious way. So obvious it might in fact be considered to be "hidden in plain sight". But it actually causes something that many people might interpret as being emergent, or even magical.
So let's look at the terms "sleight of hand" and "magic". Magic is something that has no place in a rational system. It would violate the PSR, just like randomness does. Let's take the case of a magician doing the classic card-in-the-hat trick. They take a card, palm it, then pretend to throw it in the hat; then they show that the hat is empty (first surprise); then they reach behind a person's ear, un-palm the card, and pretend the card was behind the person's ear all along (second surprise). The "real" (ontological) physical mechanism is sleight of hand -- it's not doing anything extraordinary -- basically hiding the card behind the magician's hand. At the logical or semantic level (epistemological level?), of course, it's (incorrectly) interpreted as magic.
While "sleight of hand" has some negative connotations in the vernacular, here we mean it as a compliment: a "hack" for getting a desired behavior -- one that does not involve anything supernatural or irrational. "Magic", on the other hand, while mostly having positive connotations in the vernacular, we interpret as being negative: something involving either supernatural forces or irrationality.
If sleight of hand is chemical, then magic is alchemical.
Knowing the sleight-of-hand mechanism of a magic trick takes you one step closer to the truth, empowers you, and is enlightening. Interpreting a magic trick as actual magic removes you from the truth, alienates you, and is endarkening.
So for purposes of our discussion we will consider the y-combinator to be something that is utilizing "sleight of hand", in our positive sense of the phrase.
Q-Combinators
So what if you have (or allegedly have) a combinator of some sort that uses magic instead of sleight of hand to achieve some emergent property? What should we call this type of combinator? Well, we can't call it a y-combinator, since that is reserved for rational, "sleight of hand" transforms only. I propose we call this type of combinator a "q-combinator". Note: this is my own terminology. I originally wanted to call it either a z-combinator or an omega-combinator, but these are actually used by CS already, basically as slight variants on the concept of a y-combinator.
The "q" in q-combinator has two connotations. One is as if it comes from the "Q Continuum" (the super-advanced beings from the Star Trek franchise), which is suggestive of the power and "magic" aspect. However, the "q" can also be interpreted as "questionable". Thus a "q-combinator" is a questionable form of y-combinator, and is usually a term of derision, although still leaving open the (unlikely) possibility that it might someday be proven to be a legitimate transform.
So in summary, we can say that y-combinators and q-combinators allow for transcendent things to emerge from base materials. If the extraction process is rational, "chemical", and non-magical, it's a y-combinator. If the extraction process is mystical, alchemical, and magical, then it's a q-combinator *1.
*1: Another term for a q-combinator might be WSFM: Weird Science and Fucking Magic.
A good example of a q-combinator would be the materialistic notion that unconscious matter can somehow be transformed into mind. As they point out in OM books, this is a category error, much along the lines of trying to get a floating point number by only adding integers: if you only add integers together you'll never get to pi (3.1415..). *2
*2 Of course you have the whole internet meme that says that the sum of all positive integers = -1/12, but that's mostly playing around with interpretations of the "=" operator (identical, equal, or equivalent) among other fallacies.
Another example of a q-combinator comes to us from the cryptocurrency domain. Namely, the notion that bitcoin is an example of an economic asset as opposed to a financial asset. The main difference between economics and finance, to me, is that economics can create wealth and is, potentially at least, non-zero-sum, while finance creates nothing and is always zero-sum. Gambling in a casino is a good example of financial transactions: at the end of the day, if the house is up one million dollars, some person or persons are necessarily down one million dollars. Not a single dollar of economic wealth was created during the day. With economics, however, if company A makes a profit, it does not necessarily follow that company B has to lose some profit. It's possible for two companies to grow by creating new wealth, that is to say increasing the size of the overall pie, without necessarily cannibalizing each other's market share.
Economic assets are things like houses and stocks. Each has an inherent economic value in addition to a fluctuating financial value. That is to say, if a house for some reason was valued at zero dollars, you could still at least live in it, i.e. it still has some residual utilitarian value. The same goes even for stock, since it represents ownership of a portion of a company. Thus if you owned 10% of Apple Inc., and for some reason the stock was valued at zero, you would still be entitled to 10% of Apple's profits. Of course, what confuses the issue is that most things have both an economic value and a financial (marketplace) value. But bitcoin has financial value only. If it was financially valued at zero, it would literally be worth nothing -- you couldn't sleep in it, or get a certain percentage of any profits. Bitcoin, like all fiat currencies, was designed to be a purely financial instrument, to be used only for financial transactions like buying pizzas, but people strangely and bizarrely started using it as an economic asset, e.g. selling their houses or stock to buy bitcoin for their retirement funds.
The fact that bitcoin, a financial asset, can be exchanged for economic assets is what confuses people: the equal sign in this case is one of equivalence, not equality. Crypto applying an economic paradigm to a financial asset is a category error of the worst degree, and thus a q-combinator, just like science (empirical) applying rationalism (math) to its paradigm.
And when improper paradigms are mixed together, fallacies, paradoxes, illusions, and enigmas are sure to arise.
In summary, bitcoin is using q-combinator magic against a financial asset to erroneously obtain the "emergent" property of it being an economic asset.
Some other y-combinators
So let's return to y-combinators, and look at some examples of systems that, metaphorically speaking, contain their own logical equivalent of the y-combinator -- extracting emergent and transcendent properties using a simple transform.
I think one obvious example is how, at least for integers, multiplication can emerge from addition. Let's say you had a CPU that for some strange reason could only add integers, but not multiply them. You might think that you would get something like an "unsupported operation" error if you entered something like "5 * 3". But let's say one day you entered this after some "genius" hacker updated the code, and got back "15" as expected. You might think he somehow updated the system to support multiplication, and pay him handsomely. However, it turns out all he did was add 5 to itself three times: 5 + 5 + 5 = 15. This is a y-combinator, sleight-of-hand hack to get the job done. Of course, it will take forever if you type 10^42 * 10^72 (but it could actually do 3 * 1000000 quite quickly, because 1 million + 1 million + 1 million = 3 million, i.e. only three additions).
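The hacker's trick can be sketched in a few lines of Python (my own illustration; it handles non-negative integer multipliers only):

```python
def multiply(a, b):
    # "multiplication" emerging from nothing but repeated addition
    result = 0
    for _ in range(b):   # add a to the running total, b times
        result += a
    return result

print(multiply(5, 3))        # 15
print(multiply(3, 1000000))  # 3000000, but note this loops a million times
```

Choosing the smaller operand as the loop count is the obvious optimization, which is why 3 * 1000000 can be fast while huge-times-huge cannot.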
Note: multiplication can be substituted with addition only for integers. In the case of matrices, matrix multiplication cannot be replaced by any combination of matrix additions.
An even more abstract example of a y-combinator-like transform would be getting a bomb from an airplane. Obviously, as any 9/11 terrorist or kamikaze could tell you, a plane can actually be used as a "bomb" to destroy a building *3. You don't have to do anything magical; it's an inherent property of a plane (or any large object). Funny how for decades people were blind to this possibility and then suddenly discovered and rediscovered it, but it's really there, hidden in plain sight (no pun intended). Of course, a negative side effect is that it also means a suicidal death for the pilots. There's oftentimes a cost associated with a y-combinator. But with a few pocket knives, and the willingness to kill themselves, 19 hijackers on 9/11 were able to inflict maximal damage, including wiping a trillion dollars in valuation off the stock market.
*3 one has to wonder if this paragraph will cause this post or even my entire blog to be shadow banned, if not outright banned. I mean seriously, I'm not joking. I really hate to use this terminology. Google algorithm Gods, I swear I mean no ill intent!
Y-Combinators in OM
To tie this back to OM, let's see if there are any ("essential", "logical", or "virtual") y-combinators or q-combinators in OM.
Let's start with y-combinators. I think one obvious candidate is how OM is able to get everything from nothing ("Once you understand nothing, then you understand everything"). In other words, in OM, zero is not a void. Indeed, it's the complete opposite: a totality. On a smaller scale, you have tautologies like 0 = 1 + -1. Note how the left side is zero, but the right side, while overall equal to zero, is not zero: "1" is something and "-1" is something. So this equation shows how "zero" (nothing) can actually be "something". In the extreme, you can think of zero as being the sum of all positive and negative integers, including positive and negative infinity. Then throw in all the imaginary numbers to boot, and then you can reasonably say zero contains all the numbers of the world (or you can substitute sine waves of varying frequencies if you like). Zero, or nothing, can be thought of as a totality, a plenum instead of a vacuum *4. Thus it's easy to extract "something", indeed the entire universe, from nothing. This is the ultimate y-combinator.
*4 A totality of possible things only, however. There are, in fact, an infinite number of things that are not in the totality of zero. For instance, the two integers (both greater than one) that when multiplied together equal 17 (since 17 is a prime number) will *not* be in the zero totality, since they do not exist and never will in any universe for all time (e.g. eternally), although the concept of those two numbers will be in there.
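The arithmetic version of the zero-as-totality claim is easy to sanity-check in Python: pair every positive integer up to some bound with its negative, and all those "somethings" collapse back into "nothing".

```python
n = 1000
# every positive integer up to n, plus its negative mirror
somethings = list(range(1, n + 1)) + [-k for k in range(1, n + 1)]
print(sum(somethings))  # 0: a plenum of paired opposites summing to nothing
```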
Another candidate for a y-combinator in OM is the Fourier transform. This is how you can get the time domain of matter, or space-time itself, from the frequency domain. The time-domain emerges from the frequency domain. This is how it's normally thought of. However, an equally valid viewpoint is to consider the frequency domain as emerging from the time domain. We'll talk more about this later when we get to the concept of supervenience.
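For concreteness, here is a minimal hand-rolled discrete Fourier transform in Python showing the two directions (this is the standard analytic DFT, not any ontological formulation; the function names are my own):

```python
import cmath

def dft(samples):
    # time domain -> frequency domain
    N = len(samples)
    return [sum(samples[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

def idft(spectrum):
    # frequency domain -> time domain (the "emergence" direction)
    N = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N
            for n in range(N)]

signal = [0.0, 1.0, 0.0, -1.0]   # one coarse cycle of a sine
roundtrip = idft(dft(signal))
print(all(abs(a - b) < 1e-9 for a, b in zip(signal, roundtrip)))  # True
```

The round trip recovering the original signal is the point: each domain fully encodes the other, so which one "emerges" from which is a matter of perspective.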
These are examples of where OM shines: in "extracting" things, the opposite things even, from other things using really basic mechanisms via properties embedded in the things themselves, not via magic, mystery, or hand-waving. Absolutely beautiful. And absolutely beautiful even if this isn't how the universe is actually formed, although it's hard to think of any other way once you get this way of thinking.
Q-Combinators in OM
Ok, so what about q-combinators i.e. some questionable transforms, that are perhaps magical that may exist in OM? Well, I can think of one, and it's a pretty big one. But before I get to it, let me say it's only a big one if you accept their own words. I personally would not qualify the thing we are about to talk about as being a q-combinator: I would qualify it as a y-combinator. But by their own words (and in my terminology), they should qualify it as a q-combinator. However, they don't.
The thing that I am referring to is this: any conclusion in OM that is derived from non-ontological math -- that is to say, from the standard analytic formal systems used to prove Euler's Equation, the properties of sines and cosines, the Fourier transform, Gödel's incompleteness theorem, etc. -- is a q-combinator.
According to their own words, any axiomatic system that is derived from several independent axioms, or that is not based on the PSR (that is to say, all current algebras, geometries, calculi, etc.), or that has any invalid axiom, is not valid. Any "truth" that is derived from a non-truth is not only suspect, but wrong (once again, their words, not mine). Euler's Equation (EE) is derived from traditional math, which is built upon axioms like the commutativity and associativity of numbers. One axiom in particular, which they analyzed in one of their books -- "the successor of n is greater than n" -- they explicitly state is invalid. This axiom is basically saying that "9", which is the successor of "8", is greater than "8", and in general that "n+1" is greater than "n". It seems totally obvious. But, as they pointed out, what happens when you have a circular number system where some zenith number wraps around to zero? There the successor of the zenith number is zero, which is not greater than the zenith number.
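The circular number system they describe is easy to model with ordinary modular arithmetic; here is a Python sketch with a made-up zenith value:

```python
ZENITH = 9  # hypothetical zenith number; this system holds 0..9 only

def successor(n):
    # the successor wraps around past the zenith, like a clock face
    return (n + 1) % (ZENITH + 1)

print(successor(4) > 4)            # True: the axiom holds away from the zenith
print(successor(ZENITH))           # 0: the successor of the zenith wraps around
print(successor(ZENITH) > ZENITH)  # False: the axiom fails exactly here
```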
I'm certainly not saying that EE, or Fourier transforms are wrong or are q-combinators, just that they are q-combinators from the even more stringent perspective of OM, which is a higher standard than even AM (Analytical Math).
One possible get-out-of-jail card for a math system having multiple non-PSR axioms (as I go into in more detail here) is to consider the axioms themselves to be compatible with the PSR, that is to say, to consider the PSR to be a meta-axiom. As far as I know, they (the AC) don't suggest this. But the reason I mentioned the "successor of n is greater than n" axiom is that they explicitly say it is an invalid assumption. I don't know for sure that EE is derived using this axiom, but I assume EE is built on top of algebraic theorems, and I'm pretty sure the "successor of n" axiom is part of modern algebraic derivations. Anyway, according to their own words, anything derived assuming the "successor of n" axiom is wrong. Thus, when they all of a sudden start talking about properties of Euler's Equation, or Fourier analysis, this is a q-combinator leap of faith. I don't mean talking about the essence of EE, but about tangible numerical results. We can only get tangible numerical results from the analytic EE, not from some hypothetical ontological EE (because no such equation currently exists). Like I said, I'm OK with using EE, but by their own words they should not be, yet almost always they are. Not an attack, just an observation.
Perhaps "the successor of n is greater than n" only breaks down as numbers get close to the zenith number. That is to say, maybe EE starts to break down near the omega point. The solution is to derive EE using pure ontological math. However, I don't know if this is even possible using only the PSR, which is an expression of human-level intuition, not a precise mathematical axiom. But certainly this is a topic for an entire post. I hate to have to point this out, but once again I need to be an honest broker. You should just be aware of when you're possibly using a q-combinator when you think it's a y-combinator: always mind your y's and q's!
I actually asked about the non-ontology of EE on the Climate of Sophistry website. I was going to further analyze the exchange here, but seriously, as I previously said, it's worthy of an entire post unto itself, and I really don't want to get into it right now. However, I would encourage you to look at the exchange, and ask yourself if the response adequately addresses the issue. I mean, I really don't want to be a dick about the whole thing, so I was pretty positive in my response, but just now as I re-read it, I don't think the response really addresses the issue. It basically just presupposes that EE is true. And the assertion that "all of mathematics is discovered not invented" just isn't true per the OM authors. They explicitly state that all of analytical math is invented, made-up, and wrong. Only OM is true. Anyway, the original responder was just an OM enthusiast from the general audience, and I appreciate his attempt at a response, so I'll just leave it at that.
Update: I now realize that the Climate of Sophistry speaker himself (JP) added a response. I think his response is a little deeper, but I'm not quite sure I follow it. Note: I'm going to leave a copy of the interaction at the end of this post, just in case the original goes away. I actually asked @diabolicallyInformative the same question as well, and I remember him giving me a thoughtful response, but his YouTube channel is now gone, so I can't look at it; that's why I decided to archive the Climate of Sophistry exchange at the end of this post. BTW, I believe I remember diabolicallyInformative saying something along the lines that he thought the ultimate solution was to fully "OM" Euler's Equation. Either way, no answer I've received on the question has adequately addressed the issue as far as I'm concerned. However, I do concede this could be due to limitations in my own understanding, but I really don't think so. That is to say, I think we should just accept that it's a q-combinator situation for the moment and leave it at that.
Supervenience
I would like to end with one last concept: supervenience.
Supervenience is the opposite, or the inverse dual of emergence.
I said that in a y-combinator you can think of the embedded thing as being extracted and thus emerging from (or transcending) the underlying system. An alternative is to think of the y-combinator property as being dominant or explicate from the beginning, and it in turn supervenes onto the underlying base system. That is to say, the transcendent reigns down (or should it be "rains" down?) onto the nomos base.
What is the cause, and what is the effect? Does the transcendent (e.g. god) supervene upon the base (e.g. humans) or does the transcendent (god) emerge from the base (humans)? Is the transcendent the cause and the base the effect, or is the base the cause and the transcendent the effect?
It's an interesting question to contemplate, and I'm afraid I don't really have an answer. I speculate it's something along the lines of the base and the transcendent existing at the same time, and either they simultaneously cause each other, or they are completely independent.
Take design patterns from CS. Design patterns are created by composing classes from traditional OOP frameworks. In effect, design patterns are emergent types built from the underlying classes. So typically, you might say the properties of the classes determine the properties of the patterns. However, the patterns have their own goals, that is to say they are designed independently of the classes. Thus you could equally say that it's the properties of the patterns that supervene upon the underlying classes.
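A tiny sketch of the idea in Python (an Observer-like pattern, with hypothetical names): the classes below know nothing about "the pattern", yet wiring them together produces it.

```python
class Subject:
    """An ordinary class: holds a list of observers and notifies them."""
    def __init__(self):
        self._observers = []
    def attach(self, obs):
        self._observers.append(obs)
    def notify(self, event):
        for obs in self._observers:
            obs.update(event)

class Logger:
    """Another ordinary class: records events it is told about."""
    def __init__(self):
        self.seen = []
    def update(self, event):
        self.seen.append(event)

subject, logger = Subject(), Logger()
subject.attach(logger)
subject.notify("hello")
print(logger.seen)  # ['hello'] -- the "pattern" emerged from plain classes
```

Whether the pattern emerges from the classes or the pattern's design supervenes on them is exactly the dual perspective discussed above.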
Conclusion
One of the things I really like about learning OM is that it gives me a whole new vocabulary for describing concepts from engineering, and vice-versa. In this post, we exploited this property by taking a fundamental concept from CS, namely the y-combinator, and extending it to OM to help us further understand it. We then used and interchanged the languages of the two systems to describe OM concepts from a CS perspective and vice-versa.
After having established some common vocabulary, we attempted to characterize some properties of OM using the language of y-combinators and q-combinators. We then considered a possible q-combinator in OM, namely the "magical" substitution of Analytic math for Ontological math at certain points in OM's expansion: presumably a controversial topic within the community, to be sure, but hopefully one that is edifying whether you agree or not.
We finally touched upon supervenience, the less well-known inverse dual partner of emergence, and asked whether the transcendent causes the base or the base causes the transcendent.
As always, my goal is not so much to be an authority on any topic, but to make you think, and encourage you to learn some new things. Hopefully, I have achieved this in this post, whether you agree or disagree with what I have to say, or my conclusions.
Appendix
Comment interchange on the Ontological Nature of Euler's Equation from Climate of Sophistry:
mfaustTAN1945
6 months ago
Isn't Euler's equation itself axiomatic? That is to say, it is derived from standard analytical math (not from ontological math), and based on axioms such as the commutativity and associativity of numbers, and that the successor of n is greater than n etc. So once you start making conclusions based on Euler's Equation within Ontological Mathematics, you're implicitly bringing in axioms from a system that is not ontological or tautological. Wouldn't that then "taint" any subsequent conclusions you reach as being non-ontological and non-tautological? Note, not necessarily "wrong" just not fully ontological or tautological.
===========================
HOEL
6 months ago (edited)
I think your comment will likely be addressed in the follow up video, but ill see if I can Illuminate a decent answer.
This is a common philosophical categorical error that gets made between ontological mathematics and phenomenological mathematics.
First of all, all of mathematics is discovered, not invented. Euler's formula is universally true, and we just mapped out symbols to represent that mathematical relationship. Mathematics is inherent to nature, hence ontological. Our invented symbols are axiomatic representations of a mathematical nature, and are just that. Don't confuse the symbols (Phenomenological and axiomatic) with the sinusoids generated via eulers equation (ontological and tautological).
Monads experience their objective math (frequencies) subjectively, hence why phenomology is still relevant. Monads are dual aspect, meaning they are subjects and objects, zeros and infinities, axiomatic and tautological. Axiomatic systems are indeed a part of the entire picture of reality. Its the stepping stones for us as minds to become fully aware of ourselves as mathematical subjects.
Its a really crude explanations but just trying to be appropriate for a YouTube comment... And I'm still learning and honing in on this theory! Can't wait to hear what he says in his next video about these replies.
===========================
@HOEL Thanks for your thoughtful response.
So if I understand you correctly, you seem to be saying there's two "flavors" (or aspects) to Euler's Equation (EE): one "representation" (or actually not a representation, but the thing in itself), in the form of sine and cosine waves projecting off a rotating circle in a complex plane, and this form is syntactical, fully ontological, and tautological, and another one, the EE of analytical math, built from axioms, which gives us a semantic and phenomenological representation of the fully ontological aspect.
So one might concede that this second aspect (the analytical version) is in fact axiomatic, and thus incomplete and inconsistent per Gödel's Incompleteness theorem, but the first aspect is fully ontological and tautological, and therefore complete and consistent. You can then consider the second to be kind of an inferior projection of the first. And maybe as time goes on we can further tweak the analytical version to make better assumptions that more closely approximate the ontological aspect etc. And to avoid confusion, one should indicate which aspect of EE you're referring to so that everyone's on the same page, although in practice it seems that when people refer to EE in Ontological Mathematics they mostly mean the ontological aspect only.
On a quick side note, the reason I mentioned the "successor of n is greater than n" assumption, is that this is only true for linear number systems. In a circular number system, the zenith number (or infinity) will wrap around to zero. So this assumption would not be a good one for a rotating, circular number system (this is per one of their own analyses). So there's at least one axiom in the analytical aspect of EE that is not a good fit. Note: I'm assuming EE is derived using the standard axioms of abstract algebra.
Yes, I see what you mean about not getting too deep in a YouTube comment. I'm not trying to argue, or prove a point -- just understand. I really like your answer, and I too look forward to further discussion in this series.
==========================
Climate of Sophistry
5 months ago
@HOEL Excellent reply.
==========================
Climate of Sophistry
5 months ago
@mfaustTAN1945 Excellent questions. So yes, the movement itself is ontological...the circling is ontological. And yes, we then use symbols to represent the mathematical expression of the circling. But the symbols are merely that, symbols, the symbols are not the language itself. The symbols are not what create the language, whereas with axioms they are what create the language. With the human-written expression, we are simply using symbols to represent the language...the language is primary, the language of the circle, and the symbols are simply used to denote the language.
You are creating a sort of substance dualism between the ontological circling and the expression which represents it. The expression, Euler's Equation, still represents and can only express the circling, and so EE represents what is ontological. An alien species would use different symbols, but the symbols would still produce the ontological circling. So you see how ontological mathematics is universal despite aliens using different symbols for it.
== History
2022-11-04: First publication.
2022-11-09: A few minor footnotes added