Local Windows Azure: Integrate, Innovate & Australia just got smarter
Mon, 20 May 2013 02:23:00 GMT
Well folks, I’ve been greeted with the news that Microsoft Windows Azure will
be available in two geo-replicated locations here on Australian soil, coming ‘shortly’.
As an Azure MVP, and from the perspective of Breeze (a
leading Microsoft Cloud Partner), we invest heavily in cloud technologies.
What does this mean and why should I care? I hear you ask… good question
and I asked the same.
As most of you know I have a passion for integration, sticking all sorts of things
together – from small RFID devices, hand-made hand-held devices and Raspberry Pis through
to high-end ERP, financials & many other types of systems. So before I get to
the WHY aspect, let me briefly set the context.
There’s some great data in a Gartner report which caught my eye - http://searchsoa.techtarget.com/news/2240173583/Gartner-Better-collaboration-for-new-era-of-application-integration - which
came out with these findings:
Integration costs will rise by 33% by 2016;
more than half of new system development costs will be spent on integration
By 2017, over two-thirds of all new integration
flows will extend outside the enterprise firewall.
So Integration just took on a whole new face – successful integration is about
using the right tools (in the toolbox) for the right task. Now we have a
whole new drawer in our toolbox full of Azure goodies & widgets. This functionality
is just too compelling to be ignored….
…and now that it’s on Australian soil, I’d be thinking that just about every data center
service provider should be giving you cloud functionality.
Some quick cloud advantages:
scale, provisioning and ease of use
Imagine being able to spin up a SharePoint site in the time it takes me to write this
Imagine being able to ask for an extra load balanced highly available Server/Service
at the click of a button. Importantly – Imagine being able to give it back again at
the end of the weekend/day/next hour.
Not wait the typical 12 weeks for a new server to be provisioned – oh, and don’t mention
filling out the right forms. Running an application on those machines and getting
a firewall port opened….that’ll be another 2 weeks…and on it goes.
Then there’s the much-beloved enlightenment many companies seek: achieving Single Sign-On. Imagine
your customers being able to sign into your applications using their own IDs, Live
IDs, plus a bunch of other IDs, without you needing to provision more services. You can
house your identity accounts in Azure, locally or elsewhere – finally you don’t need
a Quantum Analyst to set up Single Sign-On.
My experiences in the last few weeks on client sites have been back in the world of
old – classic encumbered infrastructure service providers wanting to claim everything,
put the brakes on any new ideas and have meetings around such concepts of adding an
extra 10 GB of disk space to existing servers. These guys should be ‘can do’ people –
it’s all about choosing the right tool for the job.
Microsoft have done a great job on the developer tooling front, from the classic MS
toolset through to Apple, PHP, Ruby, Python etc. all being able to access, develop
on, publish and deploy.
We could even give a bunch of HDD drives to Olaf (our gun cyclist @ Breeze) to ride
to the Azure Data Center and offload our data, while we wait for the NBN to never
come to our area.
There are some great options coming down the track.
So let’s say we’re keen to explore – how hard/easy is it to get ‘my’ own environment
& what does this mean.
The short answer is you get an Azure Footprint which could be running in a ‘Data Center’
in Sydney. Depending on what you’re playing with you could get:
- SQL Databases, Cloud Services, Scalable Mobile Device Services, Load balanced Websites/Services/Restful
endpoints…and the list of ‘widgets’ goes on and on.
How do I interact with this environment?
Often the issue around a lot of this is that, because my beloved ‘servers’ are running
somewhere else, I’m concerned about how much control we get.
We enter the Hybrid Integration space – where, as you can imagine,
not *everything* is suited for the Cloud; there will be things you keep exactly as
they are. So there will be many, many scenarios where we have something running locally
as well as something running in Azure. Some options we have available to make
our servers ‘feel at home’:
VPN connection – we can have several flavours of a VPN connection
that connect our Azure Footprint to our local network, e.g. local
network is 10.10.x.x/16, Azure network 10.50.x.x/16. Full access to all the machines/services
and other things you have running. CRON jobs, FTP, scripts, processes, linux boxes,
samba shares, etc etc.. (I do realise the integration world is never as easy as we
see it in the magazines)
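When you’re planning those address spaces, the one rule is that the two networks mustn’t overlap. Python’s stdlib makes the sanity check a one-liner (the ranges here are just the example values from above):

```python
import ipaddress

# The on-premises and Azure virtual network ranges from the example
local_net = ipaddress.ip_network("10.10.0.0/16")
azure_net = ipaddress.ip_network("10.50.0.0/16")

# VPN routing only works cleanly if the two address spaces are disjoint
print(local_net.overlaps(azure_net))  # False - safe to connect

# Membership tests work the same way for individual hosts
print(ipaddress.ip_address("10.50.3.7") in azure_net)  # True
```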
RDP Connections – a standard level of service, really, from any service provider
Remote PowerShell Access
Azure Service Bus - Applications Level Web/WCF/Restful Services connectivity.
An Application Service can run either locally or in the cloud and this feature allows
your Service to be accessed through a consistent Endpoint within the cloud, but the
calls are Relayed down to your Application Service. There’s a few different ways we
can ‘relay’ but the public endpoint could house all the clients & their device
requests, while your existing application infrastructure remains unchanged.
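A toy sketch of the relay idea – this is just the pattern, not the real Service Bus API: the application service registers against a stable endpoint name (by opening the connection outbound), and clients only ever see that one endpoint.

```python
# Toy relay: one consistent public endpoint, calls forwarded to whichever
# application service registered against it (local or cloud - clients can't tell).
relay = {}

def register(endpoint, handler):
    """The application service connects outbound and listens on the relay."""
    relay[endpoint] = handler

def call(endpoint, payload):
    """Clients hit the consistent endpoint; the relay forwards the call down."""
    return relay[endpoint](payload)

# An on-premises order service registers, then a client calls the public name
register("sb://breeze/orders", lambda order: f"accepted {order}")
print(call("sb://breeze/orders", "order-42"))  # accepted order-42
```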
SQL Azure Data Sync – sync data between clouds & local from your
databases. So for many clients, come 8pm each day, their local database has all the
Orders for the day as per normal, without the usual provisioning headaches as the
business responds to new market opportunities to support smart devices.
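As a rough mental model of what a sync like this does, here’s a last-write-wins row sync between two in-memory ‘tables’ (entirely illustrative – Data Sync itself is configured in the portal, no code required):

```python
# Each 'table' maps a row key to (value, timestamp); newest timestamp wins.
def sync(local, cloud):
    for key in set(local) | set(cloud):
        a, b = local.get(key), cloud.get(key)
        if a is None or (b is not None and b[1] > a[1]):
            local[key] = b   # cloud row is newer, or local never had it
        elif b is None or a[1] > b[1]:
            cloud[key] = a   # local row is newer, or cloud never had it
    return local, cloud

local = {1: ("order-A", 10)}
cloud = {1: ("order-A2", 12), 2: ("order-B", 11)}
sync(local, cloud)
print(local)  # both sides now hold the newest version of every row
```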
We even get pretty graphs….
But wait there’s more…..
These details are typical performance monitor counters plus diagnostic information. We
can use the Azure admin tools to import these regularly into our usual monitoring
tools – System Center does exactly this – so our ‘dashboard’ of machines will list our local
machines as well as our cloud machines. Your IT guys have visibility into what’s going on.
We’ve been using Singapore DCs or West Coast US with pretty good performance times
across the infrastructure.
What does having a local Windows Azure Data Center mean to me:
Medical Industry – we have several medical clients allowing us to
innovate around Cloud technologies using HL7 transports. Faster time to market and
higher degrees of re-use.
Cloud Lab Manager – www.cloudlabmanager.com can
run locally for all training providers. Breeze has created an award winning cloud
based application that will certainly benefit from this piece of great news.
Creating a cloud-based application is now feasible (this particular
one was held back due to the sensitive nature of the information it carried)
And lastly I can house my MineCraft server – well, it’s my 10 yr old
son’s, and half the school’s I reckon.
So for you…
Ask yourself the question – are you getting all these features from where you currently
host/run your hardware?
Lack of infrastructure and provisioning challenges shouldn’t be holding back new ideas
& business movement. iPads, smartphones, anywhere, any time access should be the
norm, not like we’re putting another person on the moon.
It’s all about using the right tool for the job
Enjoy folks as it’s certainly exciting times for us Aussies ahead!!
MSDN: US prices vs Australian prices–what gives?
Mon, 29 Apr 2013 00:27:38 GMT
While looking into purchasing MSDN licenses for a client here’s what I found:
For the US:
Now when you change the drop-down from US to Australia we get these prices (given
that $AUD 1 = (approx) $USD 1):
So for e.g. take a MSDN – VS.NET Test.
Aussie Dollar = $3,460 US= $2,170 which
equates to $AUD 1 = $USD 0.627
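The arithmetic, for the sceptics:

```python
# The implied exchange rate buried in those MSDN list prices
usd_price = 2170  # US list price
aud_price = 3460  # Australian list price

implied_rate = usd_price / aud_price
print(round(implied_rate, 3))  # 0.627 - nowhere near the actual parity
```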
…this is what happens when living in a 3rd world country…
Azure: #WindowsGlobalAzure Bootcamp–Sydney has a great day
Sat, 27 Apr 2013 10:28:46 GMT
*** THIS EVENT IS CURRENTLY GOING ON WORLD WIDE even as we speak! ***
The wrap up of the day:
Saturday morning was nothing short of sensational in Sydney: early morning sun,
bright blue skies, the smell of coffee and a city that felt like it was snoozing, waking
for some playtime.
I walked into a room of curious minds, eager eyes and folks that were thinking of
possibilities in technology. This technology was Windows Azure.
We were above capacity & for the first time I would have been relieved if there were
a few ‘no-shows’… but none happened. Even at 5pm we nearly had a full house.
Firstly I’ve got to thank – you the students for a great day, fantastic questions
and giving your precious weekend time.
Secondly the expert speakers that have huge experience in the field.
Mark O’Shea – Paradyne
Olaf Loogman – author of a popular Win8 app CyclingTracker – Breeze
Don Jayasinghe – Breeze
Mick Badran (yours truly) – Breeze
and finally all the sponsors & people that helped enable us to
bring this to you:
What were the plans for the day:
The Agenda was set to:
Databases & Reporting
Time & Break
Apps on Azure
Time & Break
Machines & Networks
Time & Break
Planning Session - questions from the floor
Some Interesting facts:
- we had 3 MVPs in the room (that I knew of)
- we had 2 Microsoft VTSPs
- a student drove 3.5hrs one way to be here with us during the day, then back to Canberra
again after class. Massive commitment.
- we all came with Azure Subscriptions ready to go.
- a student created a WebSite, Database + Worker role working in a solution together
during the day.
- Olaf had his Mobile Services demo fail (even though it worked at 10pm last night)
due to the recent Azure Portal update at 2am this morning – the auto-generated code
from the Portal during Mobile Services application creation generates un-compilable
code for now. He did have a Plan B. Well done Olaf, some nice tap dancing.
(Olaf working his magic)
(looking out to the North Wing)
Thanks to Magnus, a fellow Azure MVP, for setting all this up worldwide, and good
luck to all the other countries.
If you blog about it – then be sure to use the hashtag #globalwindowsazure.
Azure: Public IP Ranges for Azure Data Centers
Sat, 27 Apr 2013 02:44:18 GMT
Something that you’ve always wanted at your fingertips: all the public IP ranges for
the Azure Data Centers.
<!--Below address ranges are represented using CIDR notation-->
<!--For detail on how to interpret CIDR notation refer http://en.wikipedia.org/wiki/CIDR_notation-->
<subregion name="North Europe">
<subregion name="West Europe">
<subregion name="East Asia">
<subregion name="South East Asia">
<subregion name="South Central US">
<subregion name="North Central US">
<subregion name="East US">
<subregion name="West US">
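If you want to do something useful with that file, it parses easily. A small sketch – note the `iprange` element name and the subnet value here are placeholders I’ve made up to show the shape, not actual Azure allocations:

```python
import ipaddress
import xml.etree.ElementTree as ET

# Placeholder document mimicking the subregion layout above; the real file
# nests the CIDR ranges under each subregion.
xml = """<subregions>
  <subregion name="East Asia">
    <iprange subnet="203.0.113.0/24"/>
  </subregion>
</subregions>"""

# Build a lookup: subregion name -> list of ip_network objects
ranges = {
    sub.get("name"): [ipaddress.ip_network(r.get("subnet"))
                      for r in sub.findall("iprange")]
    for sub in ET.fromstring(xml).findall("subregion")
}

# Which data center region does a given address belong to?
addr = ipaddress.ip_address("203.0.113.42")
print([name for name, nets in ranges.items()
       if any(addr in net for net in nets)])  # ['East Asia']
```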
BizTalk Stars again–our case study is now up on Microsoft.com
Fri, 12 Apr 2013 06:57:24 GMT
There’s a great Centrebet case study now up on Microsoft.com.
Azure: Getting started with Azure Backup Services
Thu, 11 Apr 2013 01:08:11 GMT
Recently Microsoft added Backup Services (Preview) in which you can invoke the cloud
as part of your backup strategy, whether it be offsite secondaries etc.
You may have heard of Microsoft’s StorSimple, which involved dropping a 2RU or 4RU
hardware device into a customer’s rack in a datacenter somewhere – no easy feat.
The reason I’m liking the Azure Backup Services approach is that it’s a software-only
install. Storage costs for backups are cheaper, and this is a feasible approach for backups.
The other cool thing is that – if I need fast access to my backups in the cloud, then
I can spin up a ‘configured’ VM in Azure (access to the same Backup Vault) and access
the backups. No need to copy them down on premise first.
Let’s get Cracking
The elements that make this Azure Backup Services work are:
1. Azure Recovery Services Backup Services – with a Backup Vault created.
2. On Premise (or anywhere else for that matter) Server with the Backup Services
Agent installed (currently Win2012 and Win2008R2 are the targeted platforms for
the Agent).
(Currently the Backup Services APIs are only planned to be used from these Agents and
not our own code… yet!)
3. A management certificate:
1. X509, pub/private keys installed in the local machine certificate store, in the
Personal (‘My’) store.
2. Public key (*.CER file) uploaded to Azure Backup Services (this is different to
the Subscription Certificates you may already have up in Azure)
The certificate can be self-signed and must have a key length of 2048 bits (or greater) and
expire within 3 years.
(if your cert fails these requirements it will either fail to upload, or fail to be
recognised – we’re dealing with Preview here folks)
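Those requirements are easy to pre-flight before you bother uploading. A minimal sketch – the `cert_ok` helper and the dates are mine, purely illustrative of the stated rules:

```python
from datetime import date

# Pre-flight the stated rules: key >= 2048 bits, not yet expired,
# and expiry no more than 3 years out. 'today' defaults to this post's date.
def cert_ok(key_bits, expiry, today=date(2013, 4, 11)):
    within_3_years = expiry <= date(today.year + 3, today.month, today.day)
    return key_bits >= 2048 and expiry > today and within_3_years

print(cert_ok(2048, date(2015, 12, 31)))  # True - matches the makecert example
print(cert_ok(1024, date(2015, 12, 31)))  # False - key too short, upload will fail
```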
1. Creating the Vault
Login to the Azure Portal (activate the Backup Services Preview feature if you haven’t
done so already) and select Recovery Services
- Add a new Backup Vault with your details. It’s point and click stuff
here, no thinking yet.
2. Create the Management Certificate for Backup Services
There are a few different ways to do this; makecert.exe is
the easiest way I find, as follows:
(run from an elevated cmd prompt if required)
Error: Please either specify the outputCertificateFile or -ss option
Usage: MakeCert [ basic|extended options] [outputCertificateFile]
 -sk <keyName>    Subject's key container name; to be created if not present
 -pe              Mark generated private key as exportable
 -ss <store>      Subject's certificate store name that stores the output certificate
 -sr <location>   Subject's certificate store location. <CurrentUser|LocalMachine>. Default to 'CurrentUser'
 -# <number>      Serial Number from 1 to 2^31-1. Default to be unique
 -$ <authority>   The signing authority of the certificate <commercial|individual>
 -n <X509name>    Certificate subject X500 name (eg: CN=Fred Dews)
 -?               Return a list of basic options
 -!               Return a list of extended options

C:\>makecert.exe -r -pe -n CN=MicksBreezeAzureBackups -ss my -sr localmachine
-eku 1.3.6.1.5.5.7.3.2 -e 12/31/2015 -len 2048 "MicksBreezeAzureBackups.cer"
* you should be able to see this Cert in the MachineCertStore on the local machine
as follows: *
The *.cer file will be on the local file system ready for uploading
3. Uploading the Certificate (*.CER) file to the Azure Portal
From the Azure Portal –> Recovery Services –> Upload Management Certificate
If all goes well, you’ll have success
You should be able to see your certificate details in the Backup Services – click
on your newly created empty BackupVault.
Now we’re ready to get onto the Server Side
3. Configuring and Registering the OnPremise Server to the Backup Vault.
3.1 Download the Agent from Backup Services
Click on the Download Agent Link from within Backup Services and
choose your selection:
Here I selected the first option – “Agent for Windows Server 2012 and System
Center 2012 SP1..”
Download the Agent (approx 17MB) and install.
This should go smoothly.
3.2 Registering the Server
Launch the Agent (if you haven’t done so already) after the above installation completes.
(mine is empty)
3.2.2 Click on Register Server
(Configure a Proxy if you need to, this is for HTTP/HTTPs traffic)
Your certificate that you created earlier should come up in the list – if it doesn’t,
ensure that both the private + public keys are installed AND the cert is in the Local
Machine store. Then rerun this step.
Select the Vault details as follows in the Agent
(I’ve hidden my subscription ID here)
Your two worlds are almost connected now; we have the Vault + the Server just about linked.
Click Next to move onto the Encryption Settings
Select a Passphrase and bear in mind that each new Server you add
which wants to restore/read the backup information from another server, will need
the same Passphrase.
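To see why the passphrase matters so much, here’s a generic sketch of passphrase-based key derivation (PBKDF2 via Python’s stdlib – not necessarily the agent’s actual scheme): the encryption key falls out deterministically from the passphrase, so a second server with the same passphrase derives the same key and can read the backups.

```python
import hashlib

# Generic passphrase-based key derivation (PBKDF2-HMAC-SHA256); illustrative
# only - the Backup agent's real scheme isn't documented here.
def derive_key(passphrase, salt=b"vault-salt", iterations=100_000):
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

# Two servers entering the same passphrase derive the same key -> restores work
server_a = derive_key("correct horse battery staple")
server_b = derive_key("correct horse battery staple")
print(server_a == server_b)  # True

# A different passphrase yields a different key -> the backups stay unreadable
print(derive_key("something else") == server_a)  # False
```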
Click the magic button REGISTER
This is also reflected on the Backup Services Portal under Servers as follows:
4. Configuring Backups – using Windows Azure Backup & Throttling
(this is very simple and similar to Windows Backup)
What files are we backing up – click on Schedule Backup
I’ve selected a small folder on the System for the purpose of the demo
Select a Time – Currently limited to a max of 3 times a day per Server.
The COOOOOL THING: click on Change Properties – and here we can configure bandwidth throttling.
- Complete the Wizard to create your first backup schedule – well done!
You’ll now notice the Windows Azure Backup shell has a Backup
Now option on the right hand side.
I selected this and ran the Backup Now ‘wizard’ in which I could also specify Throttling
for this backup.
At this stage you can also go back to the Backup Services Portal and see an entry
in the Protected Items there as well.
5. PowerShell Commands – it goes without saying that there’s a ton of PowerShell
commands to script a lot of what we did above.
Digging into PowerShell we find that the commands fall under ‘OnlineBackup’ as follows
– notice MSOnlineBackup
If I simply run a Get-OBJob command we get back some reasonable info
around data transferred etc.
Happy Backuping!!!! Great new Service.
BizTalk 2013: Breeze is ready to help you
Mon, 25 Mar 2013 23:15:49 GMT
With the launch of BizTalk 2013 coming to a city near you from next month, Breeze is across all the new features and has the expertise to get it done right - the first time.
(Watch this space - we will be announcing the launch 'party' shortly)
This is the 8th release of BizTalk and Breeze has been there as a TAP partner, early
adopter etc. for ALL 8 releases (even before BizTalk...but that scares even me). We've
provided feedback and suggestions on the current release and Breeze has the ability
to contact the Product team and raise an issue should the need arise.
BizTalk 2013 is also targeted for the cloud and will be offered as Platform as a Service(PaaS)
and Infrastructure as a Service(IaaS), so keep an eye out in the Azure image galleries
I'm pleased to announce that our current set of products have been updated and tested
to work with BizTalk 2013 environments, such as:
- Breeze Monitor - a centralised monitoring dashboard that gives you comfort
at night knowing your solutions are being looked after.
- Breeze Integration Framework - integration should be easy, this does
exactly that. We have put many new capabilities and items here for the BizTalk 2013 release.
For the on-premises BizTalk Server 2013 release, the following themes are important:
Ability to run existing BizTalk applications in the cloud (IaaS)
Simplified Development and Management Experience
Support for the latest platform and standards
In terms of features, this translates to
Integration with Cloud Services- BizTalk Server 2013 includes new
out-of-the box adapters to send and receive messages from Windows Azure Service Bus,
making it easy to build hybrid solutions. It also provides capabilities to host BizTalk
endpoints in Azure through the Service Bus Relay providing a simple and secure way
to connect external partners and application to BizTalk Server on premises.
RESTful services- BizTalk Server 2013 provides adapters to invoke
REST endpoints as well as expose BizTalk Server artifacts as a RESTful service.
Enhanced SharePoint adapter- Integrating with SharePoint using BizTalk
Server 2013 is now as simple as integrating with a file share.
SFTP adapter - Enables sending and receiving messages from an SFTP server.
Other enhancements: The ESB capabilities previously introduced in
the ESB Toolkit are now fully integrated with BizTalk Server; dependency tracking; improvements
in dynamic send ports; XslCompiledTransform; more support for protocol updates (X12 and more).
There is also a change in the licensing approach, where BizTalk is now moving to
a per-core licensing model. If you need more information on this, drop us a line.
Happy BizTalking.. :)
There’s many signs like this…
Thu, 14 Mar 2013 01:05:54 GMT
You may pass things like:
“Beware falling rocks do not stop”
on the side of the road while on a trip somewhere.
Here’s one I got today from Outlook:
“Was this info helpful?” – love it!
Azure: 6 weeks of Azure (6WOA) just got even more exciting–FEZ Kits
Thu, 07 Mar 2013 01:06:00 GMT
Folks – we’re into week 2 of the 6 weeks of Azure program and as I was planning these
sessions out with Christian last year, I thought I’d like to bring some fun
into the mix.
There are many possibilities in Azure, but none better than building
a bit of h/w, programming it & having it talk to Azure! Monitored, controlled
– how good is that.
Enter the FEZ Kits – www.ghielectronics.com
(There’s also the Raspberry Pi, which runs a flavour of Linux with a deployment of
Mono – letting you run C# code straight on a $35 computer! I’ll save that for
another post.)
These are ‘man’s Lego’ kits, as I like to think of them.
What makes these kits cool:
they run a flavour of .NET – the .NET Micro Framework. So yes, you can write C# etc. that
runs on the device.
you can get many, many additional modules
to plug into your masterpiece – things like temperature sensors, light sensors,
colour sensors etc.
they run off USB power, even a set of 4 AAA batteries would do it.
you program them via USB cable from Visual Studio.
*rich* community and developer support - http://www.ghielectronics.com/support/.net-micro-framework
FYI – I’ve added to my kit over time, and I’ve also got a Raspberry Pi that
I play with (good NFC reader).
My FEZ KIT on the left, with the PI on the right in my beautiful Lego box
Now the best thing is that the FEZ Hydra kit (above) will
be available to you (as a prize and the like) at the BOOTCAMPS as part of the 6 Weeks
of Azure program.
Let me know how you get on and if you’ve got any questions about these guys – they’re
great and good for developers.
--- from the official blurb ----
6 Weeks of Azure
Need in-person Azure Training? DevCamps are for you
Register for a DevCamp in Melbourne | Sydney | Brisbane to
learn how to use the new Windows Azure features and services including Windows Azure
Virtual Machines, Web Sites, and Visual Studio 2012 to build and move a variety of
apps to the cloud. You will see how to build web sites, mobile
applications, and enterprise-class applications.
Need help with your app? Register for a Boot Camp
Register for a Boot Camp near you: Melbourne | Sydney | Brisbane. Our
Industry and Microsoft experts will be available to help complete your
Windows Azure app as part of the 6 Weeks of Windows Azure course.
There will be a FEZ Hydra
Kit or two to win… not to mention some t-shirts and mice to give away.
Azure: An update has been born….
Tue, 05 Mar 2013 10:13:00 GMT
Scott’s team of teams have been busy and have come out with a few changes:
Some of the improvements include:
Mobile Services: Android support, East Asia Region Support, iOS dev content
SQL Reporting Services: Support in the management portal
Active Directory: Support in the azure management portal, user and domain management
Availability Monitoring for Cloud Services, Virtual Machines, Web Sites, and Mobile Services
Service Bus: New configuration tab and metrics
Storage: Ability to download blobs directly in management portal
Media Services: New monitoring metrics and quickstart experience
Cloud Services: Support for .cer certificate files upload
Localization support for five new languages
Windows Azure Store Support in 22 Additional Countries
Microsoft: There’s something you don’t see everyday!
Wed, 20 Feb 2013 09:44:41 GMT
When I went to the Microsoft Partner Portal – Boom!