Here is a relatively straightforward method using Java 8 streams:
Character[] charArray = new Character[1000];
IntStream.range(0, charArray.length)
.forEach(n -> charArray[n] = randomCharacter());
Map<Character, Long> charCountMap = Arrays.stream(charArray)
.collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
This leaves you with a map from each character to the number of times it occurs in the array.
Forget efficiency unless you are processing billions of characters a second or you are trying to run it on a digital watch from the 90s.
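To show the snippet as a complete, runnable program: the answer never defines randomCharacter(), so the stand-in below (a random lowercase letter) is an assumption, as is the class name.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.Random;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class CharCount {
    private static final Random RNG = new Random();

    // Assumed stand-in for the answer's randomCharacter():
    // returns a random lowercase letter.
    static Character randomCharacter() {
        return (char) ('a' + RNG.nextInt(26));
    }

    public static void main(String[] args) {
        Character[] charArray = new Character[1000];
        IntStream.range(0, charArray.length)
                 .forEach(n -> charArray[n] = randomCharacter());

        Map<Character, Long> charCountMap = Arrays.stream(charArray)
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));

        // Sanity check: the per-character counts sum back to the array length.
        long total = charCountMap.values().stream().mapToLong(Long::longValue).sum();
        System.out.println(total); // 1000
        System.out.println(charCountMap);
    }
}
```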
I found the answer in further discussion of issue #36388165.
Disclaimer: this does not seem to have been announced officially, so it may change later. Also, I have only tested it with MySQL, but given the nature of this solution I think the same approach should work with the pg module (it seems to accept a domain socket path as the host parameter).
EDIT (2017/12/07): Google now seems to provide official early access, and the same method still works.
EDIT (2018/07/04): it seems someone copy-and-pasted my example code and got into trouble. As Google says, you should use a connection pool to avoid leaking SQL connections (a leak causes ECONNREFUSED), so I have changed the example code a bit.
EDIT (2019/04/04): in the example below, using $DBNAME as the instance name was confusing, so I have modified the example.
In https://issuetracker.google.com/issues/36388165#comment44, a Googler says that a Cloud Functions instance can talk to Cloud SQL through a domain socket at the special path '/cloudsql/$PROJECT_ID:$REGION:$INSTANCE_NAME'.
I can actually connect to and operate Cloud SQL from the Cloud Function code below.
const mysql = require('mysql');

const pool = mysql.createPool({
    connectionLimit: 1,
    socketPath: '/cloudsql/' + '$PROJECT_ID:$REGION:$INSTANCE_NAME',
    user: '$USER',
    password: '$PASS',
    database: '$DATABASE'
});

exports.handler = function handler(req, res) {
    // use the pool instead of creating a connection per function call
    pool.query(`SELECT * FROM table WHERE id = ?`,
        [req.body.id], function (e, results) {
            // build and send the reply here
        });
};
I hope this helps those who cannot wait for an official announcement from Google.
Best Answer
In the case with Cloud SQL and Node.js it would look something like this:
You would launch yarn install and download the Cloud SQL Proxy in parallel. Once these two steps are complete, you launch the proxy, wait 2 seconds, and finally run yarn run knex migrate:latest. For this to work you need the Cloud SQL Admin API enabled in your GCP project.
Here <CLOUD_SQL_INSTANCE> is your Cloud SQL instance connection name, which can be found here. The same name will be used in your SQL connection settings, e.g. host=/cloudsql/example:us-central1:pg13. Also, make sure that the Cloud Build service account has the "Cloud SQL Client" role in the GCP project where the db instance is located.
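The flow above could be sketched as a cloudbuild.yaml along these lines. This is an illustrative assumption, not verbatim config from the answer: the step ids, the builder images, and the proxy download URL are my own choices, and <CLOUD_SQL_INSTANCE> stays a placeholder you must fill in.

```yaml
steps:
  # Run yarn install and the proxy download in parallel:
  # waitFor: ['-'] makes a step start immediately rather than
  # after all previous steps.
  - id: yarn-install
    name: node
    entrypoint: yarn
    args: ['install']
    waitFor: ['-']
  - id: proxy-download
    name: gcr.io/cloud-builders/wget
    args: ['-O', '/workspace/cloud_sql_proxy',
           'https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64']
    waitFor: ['-']
  # Start the proxy in the background, wait 2 seconds, then migrate.
  - id: migrate
    name: node
    entrypoint: sh
    args:
      - '-c'
      - |
        chmod +x /workspace/cloud_sql_proxy
        mkdir -p /cloudsql
        /workspace/cloud_sql_proxy -dir=/cloudsql -instances=<CLOUD_SQL_INSTANCE> &
        sleep 2
        yarn run knex migrate:latest
    waitFor: ['yarn-install', 'proxy-download']
```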