Your understanding is correct, if you're from the past: you've pretty much described how things looked in the 1990s.
Yes, many languages can be executed directly by a web server plugin. That's true for PHP: mod_php for Apache is still the most popular way to host it. However, high-traffic sites take a more modern approach, using the web server only as a proxy to a FastCGI backend (in PHP's case, PHP-FPM).
The output is embedded into HTML pages, which are then sent back to the client.
I think you're referring to the early-90s so-called "spaghetti code"; the modern approach is to use one of the many MVC frameworks. For PHP that would mean, for example, Zend Framework (there are numerous alternatives).
As for ASP, you're probably referring to the so-called "classic ASP", which is obsolete. Its successor is ASP.NET, which can use any of the .NET languages (C# being the most popular) and, of course, the .NET framework.
C and C++ are not typically used for web applications. When they are, such services are implemented either as stand-alone servers, as modules for the web server, or as FastCGI applications.
Perl can be executed directly from a web server module using mod_perl. There is also PSGI, which is basically a clone of Python's WSGI.
Python is a very popular language for web apps. It can be executed directly from the Apache web server via mod_python, but that is obsolete and not recommended. Currently the way to go with Python is WSGI: either a web server WSGI module (e.g. mod_wsgi), a WSGI server implemented in Python (e.g. CherryPy, web.py), or a stand-alone Python web stack (Tornado and Twisted's Web module are fine examples). And of course, you'd most probably be using a WSGI-compatible MVC framework; Django is the most popular one (again, multiple alternatives are available).
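To make the WSGI interface concrete, here's a minimal sketch (the names are my own, for illustration): any callable with this signature is a complete WSGI application, and the same callable can be hosted unchanged by mod_wsgi, CherryPy, or the standard library's reference server.

```python
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # environ carries the CGI-style request variables (PATH_INFO,
    # QUERY_STRING, ...); start_response sends the status and headers
    # back through whichever server is hosting the app.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from WSGI\n']

# To serve it with the stdlib reference server (blocks until killed):
#   make_server('', 8000, app).serve_forever()
```

The point of the interface is exactly this decoupling: the framework side (Django, etc.) produces such a callable, and the server side only needs to know how to call it.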
Ruby is another very popular language for web apps, best known for the web framework Ruby on Rails, which again is MVC. You can execute Ruby directly from a server module via mod_ruby, or via FastCGI.
Servlets/JSP are executed in stand-alone servlet containers and Java EE application servers, such as Tomcat or JBoss. They're more commonly used to add a web interface to business systems than to create stand-alone web apps.
Classical CGI (i.e. spawning a process on each request) became obsolete many years ago. It has been replaced by FastCGI (where the process is long-running rather than spawned on each request), server modules, interfaces such as WSGI and its clones, and stand-alone solutions.
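For contrast, a classic CGI handler is just a program the server spawns once per request: it reads the request from environment variables, writes headers and body to stdout, and exits. A Python sketch (the function name is mine):

```python
import os
import sys

def handle_request(environ, out):
    # Under classic CGI the server puts REQUEST_METHOD, QUERY_STRING,
    # etc. into the environment, runs this program, and forwards
    # everything written to stdout back to the client.
    method = environ.get('REQUEST_METHOD', 'GET')
    out.write('Content-Type: text/plain\r\n\r\n')
    out.write('Handled one %s request; now the process exits.\n' % method)

if __name__ == '__main__':
    handle_request(os.environ, sys.stdout)
```

The per-request fork/exec is exactly the overhead FastCGI removes: the same program logic stays resident and handles requests over a socket instead of exiting after each one.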
The request-processing paradigm has evolved as well. With CGI it was one process per request. Then came the process pool (or thread pool), with each process (thread) handling one request at a time. The most modern approach, however, is for web servers and stand-alone frameworks to use event-driven programming.
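As a sketch of the event-driven model (here using Python's asyncio; event-loop frameworks like Tornado work on the same principle): a single thread multiplexes many connections, switching between them whenever one is waiting on I/O.

```python
import asyncio

async def handle(reader, writer):
    # Each connection is a coroutine. While this one awaits I/O,
    # the single event loop services all the other connections.
    await reader.read(1024)
    writer.write(b'HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok')
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    # One process, one thread, potentially thousands of open sockets.
    server = await asyncio.start_server(handle, '127.0.0.1', 8080)
    async with server:
        await server.serve_forever()

# asyncio.run(main())
```

Compare this with the thread-pool model, where each of those connections would tie up a whole thread even while it sits idle waiting for the client.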
Best Answer
I'll expand on my comment.
I think there are a few factors that influenced the use of Python in scientific computing, though I don't think there is any definitive historical point where you could say, "Yes, that is the reason why Python is used over Ruby/anything else."
Early History
Python and Ruby are of roughly the same age - according to Wikipedia, Python was officially first released in 1991, and Ruby in 1995.
However, Python came to prominence earlier than Ruby did, as Google was already using Python and looking for Python developers at the turn of the millennium. Since it's not like we have a curated history of uses of programming languages and their influences on people who use them, I will theorize that this early adoption of Python by Google was a big motivator for people looking to expand beyond just using Matlab, C++, Fortran, Stata, Mathematica, etc.
Namely, I mean that Google was using Python in a system where they had thousands of machines (think parallelization and scale) and constantly processing many millions of data points (again, scale).
Event Confluence
Scientific computing used to be done on specialty machines like SGIs and Crays (remember them?), and of course FORTRAN was (and still is) widely used due to its relative simplicity and because it could be optimized more easily.
In the last decade or so, commodity hardware (meaning stuff you or I can afford without being millionaires) have taken over in the scientific and massive computing realm. Look at the current top 500 rankings - many of the top ranked 'super computers' in the world are built with normal Intel/AMD hardware.
Python came in at a good time since, again, Google was promoting Python, and Google was using commodity hardware, and they had thousands of machines.
Plus if you dig into some old scientific computing articles, they started to spring up around the 2000-era.
Earlier Support
Here's an article written for the Astronomical Data Analysis Software and Systems, written in 2000, suggesting Python as a language for scientific computing.
The article has this quote about Python:
So you can see that Python already had traction dating back to the late 90s, due to it being functionally similar to the existing systems at the time, and because it was easy to integrate Python with things like C and the existing programs. Based on the contents of the article, Python was already in scientific use dating back to the 1995-1996 timeframe.
Difference in Popularity Growth
Ruby's popularity exploded alongside the rise of Ruby on Rails, which first came out in 2004. I was in college when I first really heard the buzz about Ruby, and that was around 2005-2006. Django for Python was released around the same time frame (July 2005, according to Wikipedia), but the focus of the Ruby community seemed very heavily centered on promoting its usage in web applications.
Python, on the other hand, already had libraries that fit scientific computing:
NumPy - NumPy officially started in 2005, but the two libraries it was built on were released earlier: Numeric (1995), and Numarray (2001?)
BioPython - biological computing library for Python, dating back to at least 2001
SAGE - Math package with first public release in early 2005
And many more, though I don't know all of their timelines (aside from just browsing their download sites). Python also has SciPy (built on NumPy, released in 2006), had bindings with R (the statistics language) in the early 2000s, got matplotlib, and also got a really powerful shell environment in IPython.
IPython was first released in the early 2000s, and has had many features added to it that make it very nice for scientific computing, like integrated matplotlib graphing and the ability to manage computational clusters.
From the above article: a good list of scientific and numeric packages for Python.
So a lot of it is probably due to the early history, and the relative obscurity of Ruby until the 2000s, whereas Python had gained traction thanks to Google's evangelism.
So if you were evaluating scripting languages in the period from 1995 to 2000, what were you really looking at? There was Perl, which was probably different enough syntactically that people didn't want to use it, and then there was Python, which had clearer syntax and better readability.
And yes, there is probably a lot of self-reinforcement - Python already has all these great, useful libraries for scientific computing, while Ruby has a minority voice advocating its use in science, and there are some libraries sprouting up, like SciRuby, but Python's tools have matured over the last decade.
Ruby's community at large seems to be much more heavily interested in furthering Ruby as a web language, as that's what really made it well known, whereas Python started off on a different path, and later on became widely used as a web language.