In my quest to understand computing and networking, I have admitted a need to learn electronics. There’s no escaping it any more.
So let’s start with a basic concept: “electric charge” (denoted q). The electric charge of an object is basically a count of its protons and electrons. More precisely, it is its number of protons minus its number of electrons. If you think of each missing electron as a debt, the charge of something is its “electron debt”: the number of electrons it would need to gain to become neutral. Thus we say an object has “negative charge” if it has an excess of electrons compared to protons, and we say it has “positive charge” if it has fewer electrons than protons.
The proper unit of charge is the charge held by one proton, called the “elementary charge”, and denoted e. Thus a proton has a charge of 1e, an electron has a charge of -1e, an ion has a charge of some nonzero whole number of e, an ordinary uncharged object has a charge of approximately 0e, and so on.
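As a sanity check, here’s a tiny Python sketch (the function name and the example particles are my own, just for illustration) that computes a charge in units of e from proton and electron counts:

```python
def charge_in_e(protons: int, electrons: int) -> int:
    """Charge of an object in units of e: protons minus electrons."""
    return protons - electrons

print(charge_in_e(1, 0))    # a bare proton: +1
print(charge_in_e(0, 1))    # a lone electron: -1
print(charge_in_e(17, 18))  # a chloride ion (17 protons, 18 electrons): -1
```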
Unfortunately we don’t usually measure charge in electron debt. Instead we measure it in “Coulombs”, but there’s an easy conversion: 1 Coulomb is around 6,240,000,000,000,000,000 e. Conversely, negative 1 Coulomb is around -6,240,000,000,000,000,000 e.
What the Coulomb makes less clear is that charge is an integer, and that it derives from a count of electrons and protons. Something cannot have 1.5 electrons of charge.
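To make the conversion concrete, here’s a minimal Python sketch (the constant and function names are my own for illustration; the constant is the approximate figure above, not the exact value) converting between Coulombs and elementary charges:

```python
# Roughly 6.24e18 elementary charges per Coulomb.
# (More precisely, 1 C = 1 / 1.602176634e-19 e, about 6.2415e18 e.)
E_PER_COULOMB = 6_240_000_000_000_000_000

def coulombs_to_e(coulombs: float) -> float:
    """Convert a charge in Coulombs to (approximately) elementary charges."""
    return coulombs * E_PER_COULOMB

def e_to_coulombs(num_charges: int) -> float:
    """Convert a whole number of elementary charges to Coulombs."""
    return num_charges / E_PER_COULOMB

print(coulombs_to_e(1))                           # ~6.24e18 e
print(e_to_coulombs(-6_240_000_000_000_000_000))  # ~-1.0 C
```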
I wrote this because I felt like it. This post is my own, and not associated with my employer.