OK, so I got this far. I'm not posting this as an edit to the question, on the off-chance that, although this appears to be on the right track, there might be a better way than what I've been working on. Figured I'd let democracy decide!
Using this link I was able to figure out the format of the XML file that should be used with the -setParamFile switch for msdeploy. I'd also, in the past, figured out the format for the -declareParamFile XML by using the GUI embedded in IIS after installing the Web Deployment Tool.
So, given a site called 'SiteA', with two binding entries in the applicationHost.config file as follows:
<bindings>
    <binding protocol="http" bindingInformation="*:80:" />
    <binding protocol="https" bindingInformation="*:443:" />
</bindings>
(Which means, specifically: any IP address on port 80, and any IP address on port 443, each with no host header restriction.)
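For reference, the bindingInformation attribute packs three fields into one string: IP address, port, and host header. A quick illustrative parser (Python is used here purely for illustration; it's not part of the deployment itself):

```python
def parse_binding_information(binding_info):
    """Split an IIS bindingInformation string into its three parts.

    IIS stores bindings as "ipAddress:port:hostHeader", where "*" means
    any IP address and an empty host header means the binding answers
    for any host name.
    """
    # rsplit from the right so an IPv6 address like "[::1]" (which
    # itself contains colons) is kept intact in the IP field.
    ip, port, host = binding_info.rsplit(":", 2)
    return ip, port, host

print(parse_binding_information("*:80:"))  # ('*', '80', '')
```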
The actual certificate being used is not stored in applicationHost.config, but in the configuration for Http.sys (according to this article). When msdeploy prepares a package for the site, it embeds that information, which might not be a blessing, as I mention at the end.
The first step is to declare a parameters xml file that we will use to parameterise a single package for the target live servers:
<parameters>
    <!-- declare parameter for Http Binding -->
    <parameter name="SiteA-http" description="SiteA Http Binding">
        <parameterEntry kind="DestinationBinding" scope="SiteA" match=":80:" />
    </parameter>
    <!-- declare parameter for Https Binding -->
    <parameter name="SiteA-https" description="SiteA Https Binding">
        <parameterEntry kind="DestinationBinding" scope="SiteA" match=":443:" />
    </parameter>
</parameters>
Note the match= attribute values on the two inner parameter entries; these ensure that the correct binding is replaced. Each is a regex (as described in this TechNet article) that selects the existing binding value to be replaced with the parameter value supplied at deployment time.
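To make the selection behaviour concrete, here is a small sketch (Python purely for illustration) of how a match regex picks out one binding from the site's binding list. The key point is that the pattern must be specific enough to match only the binding you intend to change:

```python
import re

# The bindings for SiteA as they appear in applicationHost.config.
bindings = ["*:80:", "*:443:"]

# The match attribute from the http parameter declaration is an
# ordinary regex applied to each binding's value.
http_pattern = ":80:"

# Only bindings matching the pattern will have their value replaced
# by the parameter value at deployment time.
selected = [b for b in bindings if re.search(http_pattern, b)]
print(selected)  # ['*:80:']
```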
We save this as declareparameters.xml.
With this in place, we can now generate a parameterised package, from our staging box, from which we can then deploy, using this command line (this is to 'image' a whole IIS within which our SiteA is present):
msdeploy -verb:sync
-source:WebServer,computerName=localhost
-dest:package="parameterised.zip"
-declareParamFile:declareparameters.xml
If the web site is on a different web server, replace 'localhost' with that web server's name. The Web Deploy Agent service has to be running on that remote machine for this to work.
Now, we declare a parameters xml file that will actually provide parameter values for a deployment to a live server:
<parameters>
    <setParameter name="SiteA-http" value="[fixedIPAddress]:80:"/>
    <setParameter name="SiteA-https" value="[fixedIPAddress]:443:"/>
</parameters>
And we save that as [targetServerName]parameters.xml (in my case I have two target servers, so each gets its own parameters XML with a different file name, and slightly different IPs in each).
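Since each target server only differs by its IP address, the per-server files can be generated rather than hand-edited. A minimal sketch (the server names and IPs below are hypothetical placeholders, not from the original setup):

```python
import xml.etree.ElementTree as ET

# Hypothetical target servers and their fixed IPs.
servers = {
    "liveserver1": "192.0.2.10",
    "liveserver2": "192.0.2.11",
}

for server, ip in servers.items():
    # Build the setParameters document for this server.
    root = ET.Element("parameters")
    ET.SubElement(root, "setParameter",
                  name="SiteA-http", value=f"{ip}:80:")
    ET.SubElement(root, "setParameter",
                  name="SiteA-https", value=f"{ip}:443:")
    # Writes e.g. liveserver1parameters.xml in the working directory.
    ET.ElementTree(root).write(f"{server}parameters.xml")
```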
Finally, we can perform the parameterised deployment to the target server(s) with this command line:
msdeploy -verb:sync
-source:package="parameterised.zip"
-dest:WebServer,computerName="[targetServerName]"
-setParamFile:[targetServerName]parameters.xml
So now we can change the IPs of either the http or https binding, and, provided the original binding values are sufficiently distinct, we can parameterise any number of individual bindings that might be required for that site.
This has one drawback so far (so any alternative answers are appreciated): the SSL configuration is copied from the source machine into the package, meaning that for the SSL config on the live site to be correct on deployment, both the staging machine and the live server(s) must use exactly the same SSL certs.
What would be great is if the staging box could use a self-signed or internal cert for sanity checking, and then have the real SSL cert applied during the actual deployment, again parameterised from the XML files.
SCCM is a tool for OS deployment, software deployment, hardware/software inventory, and software (primarily Windows) updates. SCCM definitely has some quirks, but it is very good at its job. What will eventually happen is that something will click, you'll finally "get it", and things will become much easier. I would recommend sticking with it a bit longer.
SCOM solves your operational monitoring need. In fact, there are SCOM MPs from Microsoft (built primarily by the actual product teams for the products they monitor) for all of the software you mentioned.
SCOM does not directly "aggregate event logs so that there is one central place for monitoring the health of an organization", as this would be inefficient and unnecessary; instead, you write rules and monitors that perform the necessary monitoring without simply collecting everything first.
I would strongly recommend finding a System Center User group in your area. These groups are invaluable sources of contacts that know what they're doing in System Center and are usually happy to help. I would also look at http://www.myITForum.com as an excellent source of System Center information. Finally, if you can swing it, try to go to the Microsoft Management Summit in 2012 as the single best source of System Center technical information and training around.
If you have a Microsoft EA, check with your TAM and see if they can get you some sort of Quick Start deal where they'll pay for some on-site consulting hours if you add SCOM licensing. You may also already have DPS days (I think that's what they're called) included in your contract; leverage these!
Best Answer
You're correct, the contents are the same. However, the 'for Hosting Servers' package will install a number of other dependencies, such as IIS, if not already present.