Wednesday, May 23, 2018

GPUG Amplify South Africa 2018 - Preconference Day 2

Friday, May 18, 2018

What would a delectable breakfast look like without Vegemite?

Toast bread with butter and Vegemite

After breakfast, we were picked up by Adriaan and his cousin, to be taken to The Canvas Riversands Venue, a modern business incubation hub, for our Developer Day Session with the Braintree software development team. This magnificent facility will also serve as the primary venue for GPUG Amplify South Africa 2018.

The Canvas Riversands Venue

Panoramic view of the outside theatre at The Canvas Riversands

The day started with a request for the developers to propose the topics they wanted covered. As topics were voiced, we added them to a PowerPoint slide created on the fly, with an Agenda title. We have to confess, this was a little intimidating, as we weren't sure what topics would be requested, nor had we prepared any material to address them. This would be a freeform presentation, and we would have to catalog each topic into subject matter segments that we could reconcile into coherent units. By the end of the request round, the agenda took the following shape:

Microsoft Dexterity
Source Code Control with Visual Studio Team Services
Packaging and Deploying Dexterity Applications

Deploying Microsoft Dynamics GP on Azure
Developing Azure Functions 
Leveraging PowerApps and Flow

Azure Stack 
Azure Blockchain

Microsoft Build 2018

During the first half of the day, we covered all the proposed Microsoft Dexterity topics and went through an in-depth look at the software development lifecycle of a Dex application.

From left to right: Ross Pelser, Deon Dalebout, Pieter Cornelius, Ben Strachan, Loodt Van Niekerk, Adriaan Davel, Christo Booysen, David Musgrave

In the back, from left to right:
Ben Strachan, Deon Dalebout, Ross Pelser, Adriaan Davel, Loodt Van Niekerk, Adriaan Van Maanen, Christo Booysen
In the afternoon, we covered the remaining agenda items and showed practical examples of developing Azure Functions and PowerApps and Flow.

The team had a special interest in understanding the trends emerging from the recent Microsoft Build 2018 conference in Seattle. We touched on the AI-powered cloud, which stood out as the primary topic, then looked at Microsoft Azure Stack and the Azure Blockchain Workbench.

Back at the hotel, I took a power nap, then met up with Adriaan and Liezl Davel for dinner. We spent a lovely evening at the excellent Kai Thai Restaurant talking about life in America - Adriaan and Liezl are in the process of relocating to the USA. After a long day, it was time to turn in and be up early for the next adventure.

Until next post,

Mariano Gomez, MVP

Tuesday, May 22, 2018

GPUG Amplify South Africa 2018 - Preconference Day 1

Thursday, May 17, 2018

The new morning found us having breakfast at the hotel's restaurant. Prior to breakfast, David and Jennifer stopped by my room to drop off the Australian imports, mainly Allen's Chicos and Cadbury's Cherry Ripe. I also had to unload a Microsoft MVP jacket for David, which I had been storing for over a year.

With Jennifer and David Musgrave
Shortly after breakfast, we left the restaurant to take a walk around the premises. The Montecasino complex primarily comprises 5 hotels with a casino, any number of restaurants, bars, business meeting rooms, a performing arts theater, and even a movie theater, all designed to provide a unique experience that can be enjoyed by families and business visitors alike.

The Piazza
The place has an inevitable Las Vegas air and flair, comparable to that of The Venetian hotel, as evidenced by the styling of the buildings and overall décor.

With Jennifer and David

I met up with my Mekorma colleague in South Africa, Adriaan Davel. Adriaan is a Senior Software Engineer and reports directly to me. It was good catching up with him and we were both able to plan out some Mekorma-related work activities for the week and get to talk about dinner plans.

With Adriaan Davel
Later in the evening, the third contingent of international visitors arrived at the hotel. This time it was Alicia Weigel from Rockton Software, and Katie Froeber and Angie Ryan from GPUG. The ladies checked in and went to freshen up and, after a few minutes, joined us downstairs for some drinks before dinner.

From left to right:
Mariano Gomez, Alicia Weigel, Jennifer and David Musgrave, Katie Froeber, Angie Ryan, Pieter Cornelius, Deon Dalebout
Deon left for the evening as he had a 2-hour commute back to his home. Since everyone had early morning activities planned, something close by was appropriate, so we headed to The Metropolis Grill and Bar (The Met Grill) inside the Montecasino complex. Pieter joined us at the table, but left before dinner was served as he had already eaten.

Mariano Gomez, David Musgrave, Jennifer Musgrave, Katie Froeber, Alicia Weigel, Angie Ryan
Katie, Alicia, and Angie had plans for a safari the following day, and David and I would be spending a day with the Braintree development team, covering a number of topics of interest to them.

Until next post,

Mariano Gomez, MVP

Monday, May 21, 2018

GPUG Amplify South Africa 2018 - Getting here

Wednesday, May 16, 2018

Living in beautiful Atlanta, I have the opportunity to live close to the busiest airport in the world, Hartsfield-Jackson Atlanta International Airport (ATL). ATL also happens to be the home of, arguably, one of the world's largest airlines, Delta Air Lines. One of the primary benefits of this unique airport-airline combination is that Atlantans are very used to direct (non-stop) flights out of ATL to just about anywhere in the world.

As such, Delta features a direct flight from Atlanta to Johannesburg, non-stop! Although this is comforting, the 15.5-hour trip is a test for anyone remotely suffering from any form of claustrophobia. A big part of overcoming the long trip is quickly engaging with the food service, paired with a good dosage of movies. Shortly after boarding and takeoff - around 6:30 PM on Tuesday, May 15 - I began watching 12 Strong, the true story of a US Special Forces team tasked with taking out the Taliban in the immediate aftermath of the events of September 11. I must say this 2 1/2-hour movie was compelling and action-packed, and it certainly made the first couple of hours of the trip pass by pretty quickly.

However, after the movie I decided to close my eyes and try to sync up with my destination's time zone to minimize the effects of jet lag, which is rather unavoidable on these types of trips.

Flight tracker over Georgetown, Ascension Island

Between the various cabin service passes, I managed to close my eyes and fall asleep. Eight hours later, over Georgetown, the capital of tiny Ascension Island, a British territory between Africa and coastal Brazil and part of Saint Helena, Ascension and Tristan da Cunha, I definitely felt the proximity to my final destination, Johannesburg.

One thing about the flight between Atlanta and Johannesburg is your ability to surf the internet via the Gogo Inflight Wi-Fi system. It helped to know I could update my wife in Atlanta on the progress of my flight, while maintaining communication with my friend and conference organizer, Pieter Cornelius in Johannesburg. Frankly, any little thing you can do to make the time go by on these lengthy trips is totally worth it.

4 more hours and I was over the Namib Desert in Namibia (not to be confused with the imaginary republic of Nambia 😋) which is a vast expanse running some 2000 kilometers (1200 miles) along the Atlantic coast of South Africa, Namibia and Angola. I handed my camera to the lady sitting at the window, who managed to snap this beautiful picture under clear skies.

Namib Desert
20 minutes later, I would land safe and sound at O.R. Tambo International Airport (JNB) at around 5:30 PM local time. I was delighted to see Microsoft Azure banners hugging the columns in the passport control area, a sign of the progress the company is making in expanding its cloud data center presence in South Africa.

Microsoft Azure Cloud banner

Passage through passport control was a breeze and exiting the customs area was uneventful. This was a most welcome experience considering I had just spent 15.5 hours in the air and was, to say the least, a bit tired. Not wanting to wait for an Uber, I boarded a taxi to the Sunsquare Montecasino hotel. The trip was accompanied by the usual Johannesburg rush hour traffic, complicated by the fact that Lionel Messi and his FC Barcelona were in town to play a friendly against the Mamelodi Sundowns.

Sunsquare Hotel
Check-in was quick, with my friend Pieter waiting for me in the lobby. We exchanged pleasantries and, once I was settled in, had a couple of drinks at the hotel bar. Afterwards, we went to Texas Wing Bar for some finger food.

I had left instructions with the front desk to have my friend and fellow MVP David Musgrave, who was arriving a little later with his lovely wife Jennifer, call me upon his arrival. Once David checked in, he reached out, but by then I was burnt out and ready to call it a day.

Until next post,

Mariano Gomez, MVP

Monday, May 14, 2018

GPUG Amplify South Africa 2018
Tomorrow, I embark on a 16-hour journey to beautiful Johannesburg, South Africa, to attend the first GPUG Amplify event powered by Dynamic Communities. I am looking forward to meeting up with the Microsoft Dynamics GP African community and my friends traveling from the USA and Australia (yes David, that's you!).

Last year, this event was branded as reIgnite GP and was the brainchild of my friend Pieter Cornelius (Twitter: @pmcornel) with Braintree, a Vox company. Pieter and I worked tirelessly last year to bring in some of the most talented individuals in our community, and they all answered the call without hesitation - Alicia Weigel (Rockton Software), Miguel Lozano (Titanium GP), Chris Dobkins (Njevity), Lak Chahal (Binary Stream), and David Musgrave (Winthrop Development Consultants), among others - making this a well-attended event with over 100 attendees!

Building on the success of last year's event, the GPUG community decided to step in and amplify (pun intended) the exposure of reIgnite GP by throwing all their weight behind morphing it into the first GPUG Amplify outside of North America.

As is customary, I will be teaming up with my long time friend, co-presenter, and fellow MVP, David Musgrave to deliver a number of sessions. Here's what we have on the agenda:

Monday, May 21

10:45am-11:45am @ The Canvas Riversands
Microsoft Dynamics GP 2018: Customizing The User Interface

2:00pm-2:30pm @ Lecture 2
Partner Solution Showcase: Winthrop Development Consultants – Simplify Administering your system with GP Power Tools

2:45pm-3:45pm @ The Canvas Riversands
Yet Another 25 Dexterity Development Tricks and Hacks in 45 Minutes

Tuesday, May 22

8:30am-9:30am @ The Canvas Riversands
Ask Us Anything About Microsoft Dynamics GP Development

9:45am-10:45am @ The Canvas Riversands
Developing Microsoft Dynamics GP Extensions w/ Microsoft PowerApps and Flow

11:00am-12:00pm @ The Canvas Riversands
Microsoft Dynamics GP 2018: Customizing the User Interface (REPEAT) 

12:15pm-12:45pm @ Lecture 2
Partner Solution Showcase: Winthrop Development Consultants – Leveraging GP Power Tools as a Developer

2:45pm-3:45pm @ The Canvas Riversands
Understanding Microsoft Dynamics GP Security

Wednesday, May 23

8:00am-12:00pm @ The Canvas Riversands
Academy – Introduction to GP Power Tools

1:00pm-5:00pm @ The Canvas Riversands
Academy – Advanced GP Power Tools Tricks and Tips

Minutes before writing this post, I received a call to action from Bob McAdam to deliver the keynote speech at the event, in case the scheduled presenter cannot make it. To say the least, I am pumped and looking forward to assisting where I can. I am also most looking forward to our PowerApps and Flow session, as this is new material and a preamble of what is to come at the GP Tech Conference in Fargo, ND.

You can access the full conference agenda here.

Finally, I want to call your attention to the promo video of the conference:

Looking forward to seeing you there!

Until next post,

Mariano Gomez, MVP

Tuesday, May 1, 2018

Deploying Microsoft Dynamics GP on an Azure SQL Managed Instance - Part 3/3

In Part 1 of this series, I walked through the process of signing up for an Azure SQL Managed Instance preview (referred to herein as "Managed Instance" for simplicity's sake) and the subsequent deployment process once approved. In Part 2, I looked at some of the issues you would run into during the deployment of Microsoft Dynamics GP, particularly around the use of Dynamics GP Utilities to create an environment from scratch.

From reading these two articles you may have concluded -- rightfully so -- that it's not possible to deploy Microsoft Dynamics GP using a Managed Instance PaaS. This article focuses on a deployment approach that works.

Migrating Microsoft Dynamics GP to an Azure SQL Managed Instance
Since Dynamics GP Utilities is not yet equipped to create databases on a Managed Instance, the only way to take advantage of this new service is via a migration: that is, taking your on-premise databases and restoring them onto the cloud database service.

I started by backing up my on-premise system and company databases to an Azure storage account. There are a couple of ways to do this: a) you can use Microsoft Dynamics GP's built-in backup functionality, or b) you can use Microsoft SQL Server to perform this operation via T-SQL statements or SQL Server Management Studio. I chose the first method, but for more information on the second, you can refer to MVP Steve Endow's article Back up your Dynamics GP SQL Server databases directly to Azure Storage in minutes!

I first wrote about backing up Microsoft Dynamics GP databases to Azure using the built-in backup utility in my article Microsoft Dynamics GP backups with Windows Azure Blob Storage Service. Nonetheless, I will cover the subject once more in this article as I am sure the Azure Portal has changed since I first wrote that article.

During the Managed Instance deployment steps, a storage account was created to store the diagnostics log, so for ease of explanation I used this same account to store my Dynamics GP backups. However, I strongly recommend you create a separate storage account (as part of the same storage resource group) to store database backups.

Backing up the System and Company Databases

1. Once your storage account is in place, open Microsoft Dynamics GP and go to Microsoft Dynamics GP > Maintenance > Backup.

2. Enter the storage account name. This can be found in the list of resources in the Azure Portal.

3. Enter the access key. The access key can be found by clicking on the Storage Account under resources, then by clicking Access Key under Settings in the Azure Portal. You can copy the key and paste it in the Back Up Company window in Dynamics GP.

4. Enter the URL to the container. Click on Containers, under Blob Service for your storage account, then click on the blob container name displayed. Finally, click on Properties under the blob Settings to find the URL.

5. Finally, you can enter a File Name in the Backup Company window then click on Verify Account, then OK to complete the backup.

You will need to complete this process for the system and each company database in your environment.
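If you prefer the T-SQL route mentioned earlier, the same backups can be scripted from SQL Server Management Studio using SQL Server's backup-to-URL feature. Treat the following as a sketch only - the storage account name (gpbackups), container name, and key are placeholders you would replace with your own values:

```sql
-- Create a credential holding the storage account name and access key
-- ('gpbackups' and the secret below are placeholders)
CREATE CREDENTIAL AzureBackupCredential
WITH IDENTITY = 'gpbackups',
     SECRET = '<storage account access key>';

-- Back up the system database directly to the blob container
BACKUP DATABASE DYNAMICS
TO URL = 'https://gpbackups.blob.core.windows.net/backups/DYNAMICS.bak'
WITH CREDENTIAL = 'AzureBackupCredential', COMPRESSION, STATS = 10;

-- Repeat for each company database, e.g. the sample company
BACKUP DATABASE TWO
TO URL = 'https://gpbackups.blob.core.windows.net/backups/TWO.bak'
WITH CREDENTIAL = 'AzureBackupCredential', COMPRESSION, STATS = 10;
```

Either way, the end result is the same: one .bak file per database sitting in your blob container, ready to be restored.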

Restoring the databases onto the SQL Managed Instance 

With the first part completed successfully, we now have to restore the databases onto the SQL Managed Instance. To show you how this part works, I found it useful to create this short video:
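For reference, the restore shown in the video boils down to a couple of T-SQL statements. On a Managed Instance, the credential uses a Shared Access Signature rather than the storage access key, and no WITH MOVE clause is needed since the instance manages file placement. The container URL and SAS token below are placeholders:

```sql
-- The credential name must match the container URL exactly;
-- the SAS token (without the leading '?') is a placeholder
CREATE CREDENTIAL [https://gpbackups.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

-- Native restore from URL; the Managed Instance decides file placement
RESTORE DATABASE DYNAMICS
FROM URL = 'https://gpbackups.blob.core.windows.net/backups/DYNAMICS.bak';

RESTORE DATABASE TWO
FROM URL = 'https://gpbackups.blob.core.windows.net/backups/TWO.bak';
```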

With the databases restored, I fired up Dynamics GP Utilities. The database version validation completed successfully and I was sent to the window where I could run additional Utilities functions - again, creating companies is not supported at this point.

I was able to launch Dynamics GP and went through running reports, entering and posting transactions, and running SmartLists and nothing seemed out of place and I did not get any error messages.

I want to be extremely clear that this configuration is not yet supported by the Microsoft Dynamics GP Support Team. It's also worth reminding you that Azure SQL Managed Instance is still in preview, which means things may change (for the better, I'm sure) in ways that could still impact this limited test. So, if you are going to perform a migration, do so to test as much as you can, but please do not use it as a production environment.

I will follow up this series with a video blog summarizing the whole experience and outlining where I believe the Microsoft Dynamics GP development team can make a couple of adaptations to make this all work for Dynamics GP customers and partners.

Until next post,

Mariano Gomez, MVP

Monday, April 30, 2018

Deploying Microsoft Dynamics GP on an Azure SQL Managed Instance - Part 2/3

In Part 1 of this series, I walked through the process and experience of signing up for the Azure SQL Managed Instance preview and the deployment process once approved. I also covered some of the issues I ran into.

In this installment, I will take a look at deploying Microsoft Dynamics GP and, again, looking at the process and experiences I encountered while doing so.

Pre-Microsoft Dynamics GP Deployment
One of the steps within the deployment of a Managed Instance is the provisioning of a VM to be able to connect to the service. The instructions call for downloading and installing Microsoft SQL Server Management Studio on the VM. I figured this is the VM I would also be using to install Dynamics GP, IIS, and the Web Client.

NOTE: Currently, you cannot connect to a Managed Instance from outside of the Azure vNet infrastructure, and frankly it makes a lot of sense.

A few things I noticed right off the bat after connecting to the Managed Instance:

a) A Managed Instance is configured to support Mixed Mode Authentication. This is good news for Microsoft Dynamics GP, as this has been a long-established requirement.

b) The SQL Server system administrator (sa) user is disabled, but exists, unlike Azure SQL. When creating the Managed Instance, you are required to set up a separate SQL administrative account (more on this later).

c) You cannot use SQL Server Configuration Manager to make any changes to the deployed configuration, hence the "Managed" in Managed Instance.

I also downloaded Dynamics GP 2018 from PartnerSource, along with the January Hotfix. In addition, I added .NET Framework 3.5 (using Server Administrator) as it is a prerequisite needed by the Dynamics GP Bootstrap setup program.

Deploying Microsoft Dynamics GP
After running the setup and laying down the Microsoft Dynamics GP program files, I proceeded to launch Dynamics GP Utilities to create the system database, configure the account framework, and add the sample company database, Fabrikam. I tried logging in with the SQL administrative account I created during the provisioning of the Managed Instance, referenced above in point (b), only to get the following familiar error:

Login failed error

I ran through every trick in the book on this one to no avail and could not get past this problem. Next, I enabled the SQL Server sa account and was able to log into Dynamics GP Utilities without any issues.

NOTE: The admin account I created during the Managed Instance deployment worked just fine when the Dynamics GP installer asked for the server and account credentials to create the ODBC driver.

After selecting the Advanced configuration in Dynamics GP Utilities, the first screen of interest was the Database Setup window for the system database:

Database Setup

Note the location of both the data and log files.

Continuing with the instance configuration, I got to the Confirmation window without any issues and with everything looking good. After clicking Finish, I immediately got the following error:

The following SQL statement produced an error:
create database [DYNTEST] ON  (NAME = 'GPSDYNTESTDat.mdf', FILENAME = 'C:\WFRoot\DB.0\Fabric\work\Applications\Worker.CL_App13\work\data\GPSDYNTESTDat.mdf', SIZE = 50, FILEGROWTH = 20% )  LOG ON (NAME = 'GPSDYNTESTLog.ldf', FILENAME = 'C:\WFRoot\DB.0\Fabric\work\Applications\Worker.CL_App13\work\data\GPSDYNTESTLog.ldf', SIZE = 20, FILEGROWTH = 25% )

On the surface, there's nothing wrong with this statement, as it is well known to Microsoft Dynamics GP professionals. However, I wanted to see what exactly the problem was, so I fired up SQL Server Management Studio and ran the same statement and, voila! The error was now very clear:

Msg 41918, Level 16, State 1, Line 1
Specifying files and filegroups in CREATE statement is not supported on SQL Database Managed Instance.

Thinking through this, a Managed Instance stores its databases in an Azure storage container. If Dynamics GP Utilities' Database Setup window had a radio button option for On-premise vs. Azure, the T-SQL CREATE DATABASE statement could be tailored to omit the data and log file options. In fact, this statement worked just fine when I executed it in SSMS:

create database [DYNTEST]
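Beyond a radio button, Utilities could even branch the statement automatically. One possible check - just a sketch on my part, with placeholder file paths - is the engine edition reported by the server, which returns 8 for a Managed Instance:

```sql
-- EngineEdition: 8 = Azure SQL Managed Instance, 5 = Azure SQL Database,
-- 2/3/4 = traditional on-premise SQL Server editions
IF CAST(SERVERPROPERTY('EngineEdition') AS int) = 8
    -- Managed Instance: file and filegroup clauses are not supported
    EXEC ('create database [DYNTEST]');
ELSE
    -- On-premise SQL Server: keep the traditional file placement options
    -- (paths below are placeholders)
    EXEC ('create database [DYNTEST]
           ON (NAME = ''GPSDYNTESTDat.mdf'',
               FILENAME = ''C:\GPData\GPSDYNTESTDat.mdf'',
               SIZE = 50, FILEGROWTH = 20%)
           LOG ON (NAME = ''GPSDYNTESTLog.ldf'',
               FILENAME = ''C:\GPData\GPSDYNTESTLog.ldf'',
               SIZE = 20, FILEGROWTH = 25%)');
```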

Once again, not all was lost because, while I could not get Dynamics GP Utilities to create the system and company databases due to how the CREATE DATABASE is constructed, I was certain of another method that would work. Find out in my last installment of this series what I did to get things going -- yes, this story has a happy ending 😊.

Until next post,

Mariano Gomez, MVP

Friday, April 27, 2018

Deploying Microsoft Dynamics GP on an Azure SQL Managed Instance - Part 1/3

A few weeks back, I went through a series of video articles explaining the limitations preventing you from deploying Microsoft Dynamics GP on Azure SQL. See my video blog series here:

Microsoft Dynamics GP: Running System and Company Databases on Azure SQL - Part 1
Microsoft Dynamics GP: Running System and Company Databases on Azure SQL - Part 2
Microsoft Dynamics GP: Running System and Company Databases on Azure SQL - Part 3

A few days later, my friend and fellow MVP Steve Endow introduced me to a new Azure preview feature called Azure SQL Managed Instance. Being a preview feature, there was a lengthy application process in order to gain access to it.

According to Microsoft, Azure SQL Managed Instance (referred to herein as "Managed Instance" for simplicity's sake) delivers the full capabilities of Microsoft SQL Server running on Azure service infrastructure. That is, all the limitations currently imposed by Azure SQL are, technically speaking, removed from a Managed Instance. The following diagram describes the key features.

Source: Microsoft
At this point, it is probably worth pointing out that, prior to the introduction of Managed Instances, there were only two methods of deploying SQL Server on the Azure cloud computing platform:

IaaS: under this deployment model, a Windows Server virtual machine (VM) is provisioned with a license of Microsoft SQL Server. As a result, it is up to you to maintain the VM, apply all applicable service packs, back up and secure your databases, and worry about all the related database maintenance procedures. In essence, server administrative operations are no different from those performed on an on-premise deployment of SQL Server.

PaaS: this deployment model allows you to provision Azure SQL as a service, but it presents a wide range of limitations for databases designed for a traditional Microsoft SQL Server environment. One such limitation is the inability to run cross-database queries without the complexities introduced by Azure SQL (elastic queries, etc.), an ability which is vital to support the system and company database architecture of Microsoft Dynamics GP. In addition, Azure SQL deprecates many of the traditional system objects and stored procedure calls supported by traditional Microsoft SQL Server.
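To illustrate the cross-database point with a typical Dynamics GP query - the table name below is the standard GP Company Master system table, but treat this as a sketch - the following three-part-name query runs fine on a full SQL Server or a Managed Instance while connected to a company database, yet fails on Azure SQL, which does not support referencing another database this way:

```sql
-- From a company database (e.g. TWO), list the companies
-- registered in the DYNAMICS system database
SELECT s.INTERID, s.CMPNYNAM
FROM   DYNAMICS..SY01500 AS s
ORDER  BY s.CMPNYNAM;
```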

The Managed Instance Deployment Process/Experience
Deploying an Azure SQL Managed Instance (in preview) is currently an exercise in patience. I began by following the instructions detailed in this Microsoft Docs article:

Create an Azure SQL Database Managed Instance in the Azure portal

The steps can be summarized as follows:

✔ Whitelist your subscription
✔ Configure a virtual network (VNet)
✔ Create new route table and a route
✔ Apply the route table to the Managed Instance subnet
✔ Create a Managed Instance
✔ Create a new subnet in the VNet for a virtual machine
✔ Create a virtual machine in the new subnet in the VNet
✔ Connect to virtual machine
✔ Install SSMS and connect to the Managed Instance

If I had to do this all over again, I would wait until acceptance into the preview program had been confirmed before going through the above steps. It's also worth noting that the deployment of the Managed Instance, in my particular case, took a bit over 27 hours. In the process, I ran into the following issues:

► At first, I did not receive a notification of acceptance (or rejection) within the program. This was easily resolved by opening an Azure subscription support case.

► When I initially provisioned my vNet, I created a single subnet spanning the full range of IPs afforded by the address space. Not being a network engineer, I am sure that this is something a seasoned professional wouldn't have done. It is recommended that subnets be smaller partitions within the address space. In any case, I initially assigned the Managed Instance to the only subnet I had and this caused the deployment to fail.

As it turns out, a Managed Instance requires its own subnet. Since my Azure environment is a sandbox, I had no problem in removing ALL resources and recreating them with the recommendations in the article. At the same time, I created specific resource groups for VMs, network resources, and storage resources, and furthermore set up a subnet for my VMs, different from the Managed Instance subnet -- in short, much, much cleaner! This is the result:

Resource Groups


► Once I got this all going, the deployment process began. Somewhere past the 27-hour mark, I decided to check the provisioning status and found this error message: "Cannot process request. Not enough resources to process request. Please retry you (sic) request later."

It took a few email exchanges with the SQL Cloud Lifter Ops team -- yes, there's such a thing -- and I was up and running and ready to connect to my instance. For the record, I had over 3 teams helping me with this.

When all is said and done, you will end up with the following resources:

1. A vNet
2. A subnet for the Managed Instance and, for good measure, a subnet for the VM
3. A VM
4. A route table associated with the Managed Instance subnet
5. A storage account that stores the deployment diagnostics for the Managed Instance

In the next installment, I will go through the deployment steps for Microsoft Dynamics GP. There is a lot to cover, so I will give you time to digest the content of this first installment.

Until next post,

Mariano Gomez, MVP