The relational model was first formulated by E.F. Codd in 1969, and was first implemented in various IBM databases in the '70s. It also looks to me like the first edition of C. J. Date's seminal "An Introduction to Database Systems" was published around 1974.
I was hearing noise about relational databases in the 80s, without even paying much attention. Oracle, Ingres, and Informix were all shipping server-oriented commercial relational databases in the early 80s and doing quite nicely at it.
As for when relational databases were widely used on microcomputers, well... that's another matter. The most popular microcomputer databases back in the 80s were decidedly not relational - this would include dBase, FoxPro, FileMaker and Paradox.
I would say the transition toward relational databases on microcomputers generally occurred during the 90s, as the ability to network to relational databases hosted on servers became common. It certainly did for me - at the beginning of the decade, I would look rather blankly at people who mentioned relational databases, and by the end of the decade, I was frustrated at having to work with FileMaker (as opposed to Informix, Oracle, or MS Access) because it wasn't sufficiently relational and didn't support SQL.
I'm inclined to agree with you for the reasons you state. If you use the same database, you also get a very practical business benefit - you can easily 'upgrade' a customer to the full version without a complicated data export/import.
One thing I'd recommend is trying the single-database approach first. If that becomes a problem, it's far easier to split the 'express' data out into a new database later than it would be to merge two databases you started off with.
Best Answer
A direct answer to your question might not do you much good, but it boils down to using triggers if your databases are on the same server, or using your account-creation logic to trigger creation in the other application remotely. However, I'd like to stress that syncing data increases the risk of data loss unless some robustness mechanisms are in place (scheduled synchronization, repeated attempts at triggering the creation, etc).
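To make the "repeated attempts" idea concrete, here is a minimal sketch of a retry wrapper around the remote account-creation call. The names (`create_remote_account`, `create_fn`, `max_attempts`) are all hypothetical - the point is only that a failed remote call should be retried, and a final failure should be queued for a later scheduled sync rather than silently dropped:

```python
import time


def create_remote_account(create_fn, max_attempts=3, delay_seconds=0):
    """Call create_fn (the other application's account-creation hook),
    retrying a few times before giving up.

    Returns True on success, False after exhausting all attempts --
    in which case the caller should queue the account for a scheduled
    re-synchronization pass instead of losing it.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            create_fn()
            return True
        except Exception:
            if attempt == max_attempts:
                return False  # hand off to the scheduled sync job
            time.sleep(delay_seconds)
```

In a real system `create_fn` would be an authenticated HTTP call or RPC into the second application, and the failure branch would write to a persistent retry queue rather than just returning False.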
A better approach is almost always to maintain a single source of truth. That said, this is not free of complexity either.
Here are some approaches you could go for:
Assumptions
Given the formulation of your question, I'm going to assume all of the user accounts should be replicated on both applications, i.e. there are no app-specific user accounts.
Approach #1: same database server, separate database
A straightforward approach would be creating a third database that contains the data shared by both applications, in this case the user accounts. You can reference that database from your usual queries, since it's all on the same server. (If you're using Django, you'll need to specify the database on your User model; to do this, you'll have to look into replacing the default auth system's user model, which is more of a hassle than I'd like to admit.)
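In Django specifically, the standard way to point a set of models at a second database is a database router. Below is a minimal sketch; the `"users"` database alias and the `"accounts"` app label are assumptions for illustration, and would have to match your `DATABASES` setting and the app holding your user model:

```python
class UserRouter:
    """Route all models in the (hypothetical) 'accounts' app to the
    shared 'users' database alias; everything else falls through to
    the default database."""

    route_app_label = "accounts"

    def db_for_read(self, model, **hints):
        if model._meta.app_label == self.route_app_label:
            return "users"
        return None  # None means "no opinion": use the default database

    def db_for_write(self, model, **hints):
        if model._meta.app_label == self.route_app_label:
            return "users"
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Only allow the shared app's tables to be created on 'users'.
        if app_label == self.route_app_label:
            return db == "users"
        return None
```

You'd then register the router via `DATABASE_ROUTERS = ["path.to.UserRouter"]` in settings, so both applications transparently read and write user accounts from the shared database.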
Approach #2: separate authentication service
You could maintain a separate application (a service, to be precise) which would be queried over a secured connection to verify authentication claims. This introduces significant complexity: you'd have to authenticate your sessions manually, which is somewhat tedious if you're used to it being automated by popular Python frameworks.
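To illustrate what "verifying authentication claims" can look like in the simplest case, here is a sketch using stdlib HMAC signing: the auth service issues a signed token, and either application can verify it with a shared secret without calling back to the service. This is deliberately minimal - the secret value and function names are made up, and a production setup would use an established token format (e.g. JWT) with expiry and key rotation:

```python
import hashlib
import hmac

# Hypothetical shared secret between the auth service and both apps;
# in practice, load this from configuration, never hard-code it.
SECRET = b"replace-with-a-long-random-secret"


def sign_claim(username):
    """The auth service signs a username; apps can verify it offline."""
    sig = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return "{}.{}".format(username, sig)


def verify_claim(token):
    """Return the username if the signature checks out, else None."""
    username, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences.
    return username if hmac.compare_digest(sig, expected) else None
```

Each application would call `verify_claim` on the token presented with a request and establish its own session from the result.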
A more worthwhile option might be looking into third-party authentication services such as https://auth0.com, which also provide Python documentation.
Disclaimer: I am not affiliated nor experienced with Auth0 services, they're just given as an example.