The first rule of optimizing is to profile your code and then analyze the results to figure out what you should be optimizing.
Turn on the profiling option in config.php and you'll get profiling output at the bottom of every page. Every SQL call is recorded as modules.core.classes.Gallery::search, so you can see the total number of calls, the average time per call, and the total time taken.
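For reference, the setting looks something like the line below in my copy of config.php. Treat the exact flag name as an assumption and check your own config.php, since it may differ between versions:

```php
<?php
// config.php -- enable SQL profiling output at the bottom of each page.
// The setProfile() call and its argument are from memory; verify against
// the comments in your own config.php.
$gallery->setProfile(array('sql'));
?>
```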
In my test environments, even when I have a lot of queries they take a very small percentage of the overall time spent handling the request. I think it's still too early to do a real optimization pass; my guess is that most of the time is spent in the page rendering code, which is going to get rewritten anyway. Combining multiple queries into one is the kind of thing that makes the code more confusing for a very tiny performance gain.
There's a caching infrastructure already in place (and in use) to cache expensive lookups. If we find slow queries that are being executed multiple times (I don't know of any at the moment), it's trivial to cache them. See GalleryDataCache and how it's used.
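As a self-contained sketch of the cache-aside pattern that GalleryDataCache enables: the real code would call GalleryDataCache's get/put methods, but here a static array stands in so the example runs on its own, and runSlowQuery() is a hypothetical stand-in for an expensive lookup:

```php
<?php
// Sketch of the cache-aside pattern. A static array stands in for
// GalleryDataCache so this runs standalone; runSlowQuery() is a
// hypothetical placeholder for an expensive SQL lookup.
$queryCount = 0;

function runSlowQuery($id) {
    global $queryCount;
    $queryCount++;                 // count how often we really do the work
    return "result-for-$id";       // stand-in for an expensive query result
}

function fetchCached($id) {
    static $cache = array();       // stand-in for GalleryDataCache
    $key = "MyModule::slowQuery($id)";
    if (!isset($cache[$key])) {
        $cache[$key] = runSlowQuery($id);   // cache miss: do the work once
    }
    return $cache[$key];           // cache hit: skip the query entirely
}

fetchCached(7);
fetchCached(7);                    // second call is served from the cache
echo $queryCount, "\n";            // the slow query ran only once
?>
```

The point is that the caching layer, not the query combining, is where cheap wins live: one extra isset() check turns a repeated query into a single one.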
If you're really interested in working on this, it would be very useful if you could start to identify some of the hot spots. Turn on the profiling code and then see if you can find places where we're taking an unreasonable amount of time to do processing. Take the "function_exists" call, for example. How expensive is it? Write a script that times 1,000,000 calls to function_exists and calculates the time spent per call. Is it significant? How much would we gain by caching that value instead?
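A minimal timing harness along those lines might look like this. It uses microtime(true), which needs PHP 5 or later, and nothing Gallery-specific; the absolute numbers will vary with your machine and PHP version, so compare them against the per-request totals from the profiler rather than reading them in isolation:

```php
<?php
// Microbenchmark: time 1,000,000 calls to function_exists() and report
// the average cost per call. Requires PHP 5+ for microtime(true).
$iterations = 1000000;

$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    function_exists('printf');     // an existing function: the common case
}
$elapsed = microtime(true) - $start;

printf("total: %.4f s, per call: %.4f us\n",
       $elapsed, $elapsed / $iterations * 1e6);
?>
```

If the per-call cost times the number of calls per request is a tiny fraction of the page time the profiler reports, caching the result isn't worth the added complexity.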