Made up a problem; did I screw up?
Been a long time since I did this stuff, so my error could be math or physics based.
Problem: You have a massless charge of q = +1 nC fixed in place. A small object with a charge of +1 nC and mass of m = 10^-3 kg is dropped from a position 1 m above the massless charge. How close do the two come to touching? Assume the only gravity on the falling object is from Earth.
I'm using g = 10 m/s^2 (didn't want to break out calculator) and k = 9*10^9 N*m^2/C^2.
My first thought was to simplify by solving for the point where the work done by gravity equals the work done by the electric force. At that point there's no net change in kinetic energy, so the speed is back to 0 (same as at the start). That should be the closest approach, just before the fixed charge's repulsion finally wins out over the fall.
So I defined the distance the object falls as x, set the work of the electric force as the integral from 0 to x of kq^2/(1 - x')^2 dx' (using x' as the dummy variable), and the work of gravity as the integral from 0 to x of mg dx'.
For the first integral, I got kq^2 * x/(1 - x). For the second, I got (10^-2)x.
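As a quick sanity check of the first antiderivative (my own sketch, using the problem's numbers), I compared the closed form kq^2 * x/(1 - x) against a brute-force numeric integral of kq^2/(1 - x')^2:

```python
# Sanity check: closed-form vs. numeric value of the electric-force work integral.
# Values taken from the problem statement above.
k = 9e9    # Coulomb constant, N*m^2/C^2
q = 1e-9   # both charges, C

def electric_work_numeric(x, n=100_000):
    """Midpoint-rule approximation of the integral of k*q^2/(1 - s)^2 ds from 0 to x."""
    h = x / n
    return sum(k * q**2 / (1 - (i + 0.5) * h)**2 for i in range(n)) * h

def electric_work_closed(x):
    """Closed form: k*q^2 * x/(1 - x)."""
    return k * q**2 * x / (1 - x)

x = 0.9
print(electric_work_numeric(x))  # ~8.1e-8 J
print(electric_work_closed(x))   # ~8.1e-8 J (the two agree closely)
```

The two agree to many digits at x = 0.9, so the antiderivative looks right.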
Then I set those equal to each other and got x = 1 - 9*10^-7 meters, meaning it falls almost the entire distance.
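Redoing that last step numerically (again just my own sketch with the same numbers): dividing both sides of kq^2 * x/(1 - x) = mgx by x gives 1 - x = kq^2/(mg), so the remaining gap is:

```python
# Final gap between the two charges when the object momentarily stops.
# Values from the problem statement; g = 10 m/s^2 as used above.
k, q, m, g = 9e9, 1e-9, 1e-3, 10.0

gap = k * q**2 / (m * g)  # 1 - x, from kq^2*x/(1-x) = mgx
x = 1 - gap               # distance fallen

print(gap)  # ~9e-7 m, i.e. the object stops about 0.9 micrometers away
print(x)    # ~0.9999991 m fallen
```

So the closed-form answer x = 1 - 9*10^-7 m checks out arithmetically.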
This is only meant to cover the basics, so no need to point out that the gap is smaller than any object with that much mass could physically be, or anything like that.
Thanks in advance for anyone willing to check this out.
