The real problem with talking about the technological singularity is that the classic definition is basically, "Change will be so fast/radical at some point that we can't handle it." So, you've already defined it to be something you can't do anything about, which isn't helpful.
It also assumes that there are no fundamental limits to technological progress; Clarke's third law isn't a law of physics. Meanwhile, there are real physical limits on how fast you can compute: the speed of light bounds how quickly signals can propagate, and Landauer's principle puts a floor on the energy cost of erasing a bit. So omniscient computers may not be possible, and without them we'll always be dependent on the speed of human thought.
There's no guarantee that intelligence can even scale much beyond human capacity. For all we know, there may be a scaling limit, analogous to the square-cube law that caps how big an animal can get on Earth. Or maybe greater intelligence becomes more prone to psychosis and ends up being ineffective.
I can tell you that we haven't reached the commonly accepted version of the singularity, because part of the definition is that once you reach it, you can't go back. Humans can still survive without much technology at the moment. When we start tinkering with our genes or implanting computers in our bodies for basic functions, it might be a different story.