Embarrassed

Folks,

Those who find this totally OT will be kind enough to pardon me, but quite sincerely, I would love this to be true, as I could then elaborate on a great PLT application we could make out of it; I'm really serious. But I don't want to bug people about the latter yet, without knowing whether I can assume the sort of foundational formula I'd like to base the other idea upon.

So, first things first: can someone be kind enough to point out where the big mistake I made is, at the link above? It must be so big I still can't see it...

Thanks a bunch in advance for your understanding and/or your help.
CJ

[Edit] Flaw found... Hmph. Well, unsurprisingly... I just made a quick update, though; it was just a first shot, anyway. I'll try to make a couple of other attempts along the same strategy before I definitively surrender.

[Update @ 2010/12/03, 3pm PST] Second (or rather, third) attempt; I believe it's better... unless I missed something else again. Same link as above.

Why ask us? Ask Coq.

Sadly, I don't have Coq or

Sadly, I don't have Coq or any other theorem checker within practical reach over here. I do consider the idea and may go for it eventually, yes. Thanks for the suggestion.
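
For concreteness, the question one would put to a proof assistant looks roughly like this (a minimal sketch of my own in Lean 4 syntax; the Coq statement is essentially the same, and IsPrime and goldbach are names made up for the sketch, not taken from the post under discussion):

    -- Illustrative sketch only, not the post's formalization.
    -- A number p is prime when it is at least 2 and has no
    -- divisors other than 1 and itself.
    def IsPrime (p : Nat) : Prop :=
      2 ≤ p ∧ ∀ d, d ∣ p → d = 1 ∨ d = p

    -- Goldbach's conjecture: every even n > 2 is a sum of two primes.
    def goldbach : Prop :=
      ∀ n : Nat, 2 < n → n % 2 = 0 →
        ∃ p q, IsPrime p ∧ IsPrime q ∧ p + q = n

A checker then accepts or rejects a purported proof of this proposition; it does not, of course, find the proof for you.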

Just for the record, I'm

Just for the record, I'm going to leave this attempt at a proof of GC as-is for a while, even if another flaw (or several) is found. And I'll stop posting about it, per se, anyway.

As any further flaw would likely put the probability that I find a correct proof very close to zero... I do believe this last one is better, though(?)

It's just that I sort of "dearly want" Goldbach's conjecture to be true, for a concrete application of it (well, yes...)

Well, anyway, time should tell shortly, I guess.

Goldbach's conjecture is

Goldbach's conjecture has been verified for quite large numbers; do you actually need it to be true? Even if you do, you can just have your program predicated on the hypothesis that Goldbach's conjecture is true. You'd hardly be the only one relying on unproven conjectures. Sure, you'd be boned if it turned out to be false, but probably not as much as everyone else would be if it turned out factorization was easy.
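
For what it's worth, "predicated on the hypothesis" can be made quite literal. Here is a minimal sketch (mine, with made-up names is_prime and goldbach_pair, not from the post under discussion) of a function that succeeds on every even n > 2 exactly if the conjecture holds:

    # Minimal sketch: a program whose success is predicated on GC.

    def is_prime(k: int) -> bool:
        """Trial-division primality test; fine for small inputs."""
        if k < 2:
            return False
        d = 2
        while d * d <= k:
            if k % d == 0:
                return False
            d += 1
        return True

    def goldbach_pair(n: int) -> tuple[int, int]:
        """Return primes (p, q) with p + q = n, for even n > 2.

        Always succeeds if Goldbach's conjecture is true; a
        counterexample n would make it raise instead.
        """
        assert n > 2 and n % 2 == 0, "GC concerns only even n > 2"
        for p in range(2, n - 1):
            if is_prime(p) and is_prime(n - p):
                return p, n - p
        raise ValueError(f"{n} would refute Goldbach's conjecture")

    # e.g. goldbach_pair(100) == (3, 97)

If the conjecture failed at some n, the call would raise instead of returning, which is exactly the sense in which such a program relies on the unproven hypothesis.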

Thanks for sharing your

Thanks for sharing your thoughts.

No offense, Ben; you replied too, and I could have replied to your comment, but since I said I don't want to pollute LtU further on this subject (rather OT for now, until I can be more relevant with an idea of an application in PLT, if GC is true), I invoke here the axiom of choice, thanks to my free man's will :) and will just keep it short for Derek:

Goldbach's conjecture has been verified for quite large numbers; do you actually need it to be true?

True, it has been verified for numbers probably exceeding the size of the universe man will ever be able to know or even imagine. So you're right: on earth, if we're speaking only of applications of that n = p + q, I indeed don't need it to be "true". But if it is "true" (that is, proved in some tangible formal system we invented and use with consistency; truth being just a relative notion for our brains), then I can think of consequences of it in other domains of maths, in PLT in general, and in languages more specifically.

That's why I said earlier that I'd be delighted to know it is reliable in those respects. Otherwise, except for using crypto from time to time in real life as we all do, I'm not a mathematician; I take no specific enjoyment in dealing with these "special guys" (the prime numbers).

I stumbled upon the formula, and upon its relation to other systems that I can see benefiting from it if it holds (again, not the exhaustive search over n = p + q, but rather other formulas elsewhere that we could assume or build from it).

I wasn't looking for a challenge there. I was just baffled by a connection I made from a higher point of view, which put me on a path to a possible approach to solving it while staying at the simplest formal level possible. I considered the issue very seriously before making such an effort.

I did know this is the kind of problem that can remain unsolved for a long time, until someone looks at it differently. But who dares wins.

I really have no definitive idea whether my "proof" is correct or not, of course. It has at least the virtue of staying at a high-school algebra level. I did use another theorem, but one not much more powerful (in its conclusion) than my own simple arithmetic formulas, and one which hopefully doesn't have GC among its own hypotheses in any way (Bertrand's postulate, indeed a theorem despite the name).
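
For the reader's convenience, the theorem in question states (a standard formulation, added here for reference):

    % Bertrand's postulate (Chebyshev's theorem), for reference:
    \forall n > 1,\ \exists\, p \text{ prime},\quad n < p < 2n

Note how modest its conclusion is compared to GC's: it only guarantees some prime in the open interval (n, 2n), and says nothing about sums of primes.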

(Now, sure, if we are unknowingly affirming the consequent because Bertrand's somehow relies on GC, which I hope is not the case, then my "proof" cannot be one, obviously. It's unclear to me what the situation is there. However, I guess it is quite clear that the proof relies upon AC, no matter what.)

I didn't need more than that for this simple page. I kindly asked a few people much more competent than I am to check it. One person, whose authority and work in PLT are both very well known here on LtU (*), found a silly and damaging flaw in how I brought about the contradiction, and was kind enough to point it out to me very quickly in a private communication. I managed to fix it by actually making the "proof" even simpler.

If it is correct, then given the formalism I employed, I have the weakness to believe we should know shortly, once nobody can come up with lethal and really easy objections against its statements or transitions; and then, well, yes... we had simply been missing the thing right in front of us because it happened to be blinding us.

Or, if it is inconsistent/incorrect, the same benefit from the simplicity of the formalism should apply, too. No?

Thanks!

(*) They will disclose their own name only if they want to.

Just thought I should

Just thought I should mention that Derek gave me the idea to provide a computational interpretation (in pseudo code) of the contradiction that I think eventually arises if we assume Goldbach's conjecture to be false:

same link

It now appears at the beginning of the attempted proof, along with a more formal outline of the overall reasoning (preceding the pseudo code part).
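
Not having the post's pseudo code here, the computational reading described would presumably be something like the following sketch (mine, with made-up names): a search loop that halts exactly when GC is false.

    # Sketch of the computational reading of "assume GC is false":
    # this loop halts (returning a counterexample) iff the conjecture
    # fails, and diverges iff the conjecture is true.

    def is_prime(k: int) -> bool:
        """Trial-division primality test; fine for small inputs."""
        if k < 2:
            return False
        d = 2
        while d * d <= k:
            if k % d == 0:
                return False
            d += 1
        return True

    def find_gc_counterexample() -> int:
        n = 4
        while True:
            if not any(is_prime(p) and is_prime(n - p) for p in range(2, n - 1)):
                return n  # this n would refute the conjecture
            n += 2

Under this reading, deriving a contradiction from the assumption that GC is false amounts to showing that this loop can never return.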

Unlikely...

Goldbach's conjecture has resisted proof for more than two centuries, and I find the prospects that someone other than a Wiles-class number theorist is going to solve it rather dim. Still, it's a fun exercise :-). IIRC, it's been proven under the assumption of the Riemann Hypothesis, and I believe there is a proof that it is true for all N above some very, very large bound (around 10^30000, IIRC).

Well, I've reached the point

Well, I've reached the point where I can stop digging into this (and into whether this attempted proof is valid or not).

In my understanding, everything behaves as if GC were a consequence of Maltsev's generalized recursion theorem (B+M), combined with Bertrand's postulate (B+M).

[Edit 12/7/10] (for the sake of completeness)

(B+M) Well, that is, provided I used them both properly, and provided my construction of the function denoted by NotGC, based on Bertrand's, is logically correct (NotGC is the counterexample model that Maltsev's theorem couldn't sustain); the function denoted by GC is based on Bertrand's as well, and is Goldbach's. So I suspect that if there is a breaking flaw somewhere in my reasoning, it most likely lies in one or more of these three places.
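
Schematically, as I understand the description above, the intended shape of the argument is the classical reductio (my paraphrase, not the actual proof):

    % Schematic outline only, paraphrasing the description above:
    \neg\,\mathrm{GC}
      \;\Rightarrow\; \exists\,\text{NotGC (counterexample model, built via Bertrand's)}
      \;\Rightarrow\; \text{contradiction with Maltsev's theorem}
      \;\Rightarrow\; \bot,
    \qquad\text{and hence } \mathrm{GC}.

So the three fragile pieces are exactly the ones named above: the use of Bertrand's, the use of Maltsev's, and the construction of NotGC itself.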