Comment on This Post:
Should say "False"
I was just following the original.
Fair point, I should've found that one!
Wrong. Though C is an arbitrary constant, the commonly accepted practice in international mathematics is to write it as a capital C, not a lowercase c as depicted in this post.
I wanna see a response deathdark!
dis gon be good
I responded to his comment, but thought you might want to know.
I'm going to be 100% honest. Both of your responses are waaaaay over my head. But I do appreciate your diligence.
I wouldn't even know where to begin. After three years of higher math, I forget what it's like not knowing what's going on.
Exactly, I just don't "speak the language" so to speak.
Rofl, that gif made my day.
I try :)
While methods of expression may vary -- from c to C to ¢ -- they still represent a constant that may or may not have existed in the original function. Thus, it cannot be said to be wrong, unlike leaving the constant out to begin with.
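To put that in symbols (a toy worked example; the integrand here is just for illustration, not from the image):

\[ \int 2x\,dx = x^2 + C, \qquad \frac{d}{dx}\left(x^2 + C\right) = 2x \quad \text{for every value of } C, \]

so c, C, and ¢ all name the same arbitrary vertical shift.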
True enough -- but if you're conforming to the international standards for mathematical expression, then by definition you must conform to their standards of variable representation; even if the "variable" you're representing is purely speculative and symbolic of an arbitrary shift on the y-axis. If you're going to use the integral symbol as in this image, then you can't deviate from the standard for the constant either, which is strictly capital C.
I can. I just did. I could mix Newton's notation with modern notation if I wanted to. I could call it n or x or even e if I wanted, but I chose c because of the lack of ambiguity. Arguing over notation, especially one this frivolous, doesn't detract from the accuracy of the statement.
Good luck trying to impress the IMU. When you get your Fields Medal in...ahem..."abstract" calculus, I'll be sure to congratulate you.
Yes, really. This is the notation I learnt and used through college, and at no point did a teacher even bat an eye. And considering my primary teacher has been teaching higher math for longer than I've been alive, if it's good by him, it's good enough for me. If some group of mathematicians doesn't like the notation, so what? The notation is still not mathematically wrong; it's just not exactly how everyone does it. In high school, my teacher taught us using ¢ to denote that it wasn't a variable. I didn't like it, so I dropped the slash, and she said it was okay, because it means the same thing in the end.
This is literally sillier than complaining because someone left out a <!DOCTYPE> -- except that a DOCTYPE can actually be used for something.
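(For anyone who hasn't seen one, here's the sort of thing being left out -- a toy page with made-up contents, where only the first line is the point:)

    <!DOCTYPE html>
    <html>
      <head>
        <title>example</title>
      </head>
      <body>
        <p>Browsers render this with or without the first line; omitting the doctype just drops them into quirks mode.</p>
      </body>
    </html>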
I understand that! You're assigning a symbol to a value whose existence and numerical magnitude are both impossible to determine from just an indefinite integral. I'm specifically making the point that it's the commonly accepted practice to follow the standards of the scientific community for anything in the fields of science and mathematics, and that if you don't, you're wrong (though not conceptually) -- as I said earlier in this thread. For example, the official international standard for a kg is the mass of a platinum-iridium cylinder kept near Paris. You can technically call anything a "kg", because it's a human conceptual invention anyway, but if you're going to conform to the common methods of notation, you have to do it 100%, or you're going to have inconsistencies in your work, and your colleagues often will neither respect nor be able to understand it.
And this actually relates in exactly the same way to the question of whether it's really necessary to put a <!DOCTYPE> into your code. You can technically leave it out, but in the CS community the commonly accepted standard for modern web programming is to include it. You don't HAVE to conform to that standard, but, as in the example in this image, it's more logically sound (especially at the higher levels, so people can follow your train of thought) to conform to the exact standards and practices, even if they're arbitrary. Regardless, you clearly know what you're talking about: I was just trolling you for a laugh like six posts ago and didn't expect this conversation to get so serious. Cheers, my friend.
It started as a joke, yes, but you've kept it from staying so by repeatedly insisting I'm wrong, and especially with the "abstract calculus" comment.
Your metaphor is fundamentally flawed, though. I'm not representing the wrong thing; the c is still the constant of integration. As I've said before, there is absolutely no ambiguity in using a lowercase letter over a capital letter in this case. In much the same way, my versions of discrete-math symbols are probably different from your "international standards", as that's how they were taught to me. However, a difference in notation is never a difference in truth. If a and b represent c, and c is true, then a and b are true (transitive property).
I'll not address the CS statement, as that would take too long, and would just be debating a metaphor.
I notice I tend to sound a bit angry when I get into the arguing mood... and though a few things you've said were slightly frustrating, I have no ill feelings. The subject seems to be wearing down, so it probably is time we cut it off.
You seem to be well versed in these things, though. It's nice to see another mathematically literate person running around.
I wouldn't say you sound angry. More like this:
This is long overdue, but after our little chat here, I continued to notice that the common convention is using "C" for the constant of integration. As such, I have adopted it in order to become "more correct," so to speak.
I see. What kind of projects are you working on at the moment? For the record, I agree with you that adopting a standard for an arbitrary constant is relatively pedantic, considering that the constant in question symbolizes something non-standard to begin with. Still, it never hurts to use a common convention, especially once you're substituting three or four times and/or you run across an integral involving a function named c(x).
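Concretely, a toy example of the collision I mean (the function name is made up): take

\[ \int c(x)\,dx = F(x) + C, \]

where F is any antiderivative of c(x). Write the constant as a lowercase c there and it's doing double duty as both the integrand's name and the constant; the capital stays unambiguous.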
I'm supposed to be making a Lolbrary app, but real life is getting in the way.
Dude, you're so freakin awesome lol.
who gives a shit?
Apparently you don't, oh wise ignorant douchebag.
that bitch has perfect proportions. damn