The quick answer is: the series connection produces more light. This follows from two simple observations:
(1) the source current is fixed, and
(2) series-connected elements carry the same current through them.
For the parallel connection, each bulb gets half the source current, while for the series connection, each bulb gets the full source current. Since power is proportional to the square of the current ($P = I^2R$), each bulb in the parallel combination produces 1/4 the light of each bulb in the series combination.
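As a quick numerical check, here is a minimal Python sketch of that comparison. It assumes an ideal current source and identical, temperature-independent filament resistances; the particular values are arbitrary:

```python
# Current-source case: two identical bulbs, series vs. parallel.
# The values below are illustrative assumptions, not from the answer above.
I_src = 1.0   # source current in amperes (assumed)
R = 100.0     # resistance of each filament in ohms (assumed)

# Series: the full source current flows through each bulb.
P_series_each = I_src**2 * R            # P = I^2 R

# Parallel: two identical bulbs split the source current evenly.
P_parallel_each = (I_src / 2)**2 * R    # P = (I/2)^2 R

print(P_series_each / P_parallel_each)  # 4.0: each series bulb is 4x brighter
```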
Also, if you understand that current is dual to voltage, resistance is dual to conductance, and parallel is dual to series, you can immediately get the answer by duality.
For example, consider the following:
When connected to a voltage source, two light bulbs in parallel produce 4 times as much light as two light bulbs in series.
Here's the dual:
When connected to a current source, two light bulbs in series produce 4 times as much light as two light bulbs in parallel.
Since the first sentence is true, and since the second sentence is its dual, the second sentence is also true.
The parallel bulbs will only appear brighter if the available current to the system is not limited. In that case the series bulbs will have a lower voltage across each individual bulb and will appear dimmer. If the power input to the circuit is constant, then the total wattage output from all bulbs is also constant and the bulbs will all appear equally bright (assuming the filaments all have identical resistance).
In a typical simple circuit the power source will be a battery, which attempts to hold a constant voltage across the circuit. In this case the voltage across the bulbs in parallel will be equal to the battery voltage, and the current through each bulb will be given by $V = IR$, where $R$ is the resistance of the filament. This means more current (and thus more power) will be drawn from the battery by the parallel circuit than by a series one, so the parallel circuit will appear brighter (but will drain your battery faster).
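For contrast, here is the same kind of sketch for the voltage-source case just described (again with arbitrary, assumed values). Note how the factor of 4 flips in favor of the parallel connection, exactly as the duality argument predicts:

```python
# Voltage-source case: two identical bulbs, series vs. parallel.
# The values below are illustrative assumptions, not from the answer above.
V_src = 12.0  # battery voltage in volts (assumed)
R = 100.0     # resistance of each filament in ohms (assumed)

# Parallel: each bulb sees the full battery voltage.
P_parallel_each = V_src**2 / R          # P = V^2 / R

# Series: two identical bulbs split the battery voltage evenly.
P_series_each = (V_src / 2)**2 / R      # P = (V/2)^2 / R

print(P_parallel_each / P_series_each)  # 4.0: each parallel bulb is 4x brighter
```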
Best Answer
If the bulb is lit when you pour water on it, it will undergo thermal shock: some parts cool down and shrink while other parts stay hot. This causes very large thermal stresses, which can break the bulb. Depending on the material and construction of the bulb, this may or may not result in a spectacular implosion (once the bulb cracks, the vacuum inside will be filled with air and the water you poured on the bulb). The water will hit the very hot internal parts and boil instantaneously. This can make a big mess. Certainly it is not an experiment you should do without extreme precautions.
I used to make CT tubes in a previous life. These are large vacuum tubes with a massive anode that can soak up several MJ of energy (enough for an entire CT scan), at which point they reach a temperature of around 1200 °C (meaning they glow "white hot"). One of the safety tests we had to do was to evaluate the behavior of the tube and its enclosure if the vacuum envelope failed and the insulating oil surrounding the tube (which acted as both dielectric insulator and cooling medium) was sucked into the vacuum, instantaneously boiling against the anode. It was one of the most spectacular experiments I have ever participated in; it was done in a sealed room, with video cameras and other instrumentation all around, in their own sealed boxes. The outermost container of the tube had a baffle that could absorb the expanding oil under normal conditions; under abnormal conditions it would cause the boiling oil to spray out, safely directed away from the patient.
Believe me, you don't want to do that experiment unless you really, really know what you are doing.