Category Archives: Azure and Office 365
Using Shared Mailbox in Office 365
Often, you want a common mail address, like info@domain.com or support@domain.com, that everyone on a team can monitor and interact through. Office 365 provides this capability with something called a Shared Mailbox.

Features of a Shared Mailbox
- A Shared Mailbox doesn't need an Exchange license.
- A Shared Mailbox doesn't have its own credentials. Users add the mailbox to their own and access it with their own credentials.
- A shared calendar is available in a Shared Mailbox, where everyone can see who is available when.

Setting up a Shared Mailbox
You'll need to be an administrator in Office 365 to create a Shared Mailbox. Navigate to the Office 365 Admin Center and find the Shared Mailboxes option under Groups. Click on Add a mailbox. I'll call it Sales@domain.com, for example, and click Add. The Shared Mailbox gets created within moments!

Adding Users to the Shared Mailbox
Only users who have an Exchange Online license can be added to a Shared Mailbox. Click on the mailbox and then on Edit in the Members area to add O365 users to it, as shown below. Click on +Add Members to add users to the mailbox. You'll find that all users who already have an Exchange Online license are eligible to be added. I selected both the users seen in the above step to add to the Shared Mailbox. Those members then appear on the detail pane of the selected Shared Mailbox, as shown below.

Adding the Shared Mailbox to Outlook
I'll use OWA in this blog to show how to add the Shared Mailbox to a user's Outlook. Let's assume we have the mailbox pwagh@cft79.onmicrosoft.com and we want to add the shared mailbox sales@cft79.onmicrosoft.com to pwagh's mailbox. In OWA, right-click on the root folder of the mailbox and click on Add shared folder. Start typing the name of the Shared Mailbox and it should auto-populate for you. Select the Shared Mailbox and click Add. The mailbox should then appear in your OWA.

Note: It takes a few minutes after adding it until the Shared Mailbox is accessible from your mailbox.

Hope this was helpful.
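If you prefer scripting over the Admin Center, the same setup can be done from Exchange Online PowerShell. This is a minimal sketch, assuming a connected Exchange Online session and reusing the example names from this post:

# Create the shared mailbox (no license or separate credentials needed)
New-Mailbox -Shared -Name "Sales" -DisplayName "Sales" -PrimarySmtpAddress sales@cft79.onmicrosoft.com

# Give a licensed user full access; AutoMapping makes it appear in Outlook automatically
Add-MailboxPermission -Identity "Sales" -User pwagh@cft79.onmicrosoft.com -AccessRights FullAccess -AutoMapping $true

# Optionally allow the user to send mail as the shared address
Add-RecipientPermission -Identity "Sales" -Trustee pwagh@cft79.onmicrosoft.com -AccessRights SendAs -Confirm:$false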
Email Encryption in Office 365
Overview:
O365 Message Encryption is a service based on Microsoft Azure Rights Management (Azure RMS). Once RMS is set up, email messages can be encrypted under certain rules, and recipients get two options to read an encrypted email:
- By a one-time passcode (OTP)
- By signing into their organization account

Pre-requisites:
- Activate Azure RMS in Office 365.
- Set up Azure Rights Management for Exchange Online.
- Set up a transport rule to enforce message encryption in Exchange Online.

Activate Azure Rights Management in Office 365:
Following are the steps to enable email encryption. I'm going to enable encryption on one of my trial environments.
1. Log in to the Office 365 Admin Center as a Global Administrator.
2. Navigate to the Settings section and then select Services and add-ins.
3. Look for Microsoft Azure Information Protection and open it by clicking on the highlighted link as shown below.
4. On the rights management page, you'll see that rights management is not activated, along with an option to activate it. Once you activate it, you'll see a page like this.
Here, Rights Management has been activated!

Setup Azure Rights Management for Office 365 Email Encryption:
The following steps set up Azure RMS for email message encryption.
1. Run the following commands to authenticate and connect to the session. As shown above, enter:
Set-ExecutionPolicy RemoteSigned
Enter Y/y when asked about changing the execution policy. Then enter:
$cred = Get-Credential
and provide the admin credentials for your O365 tenant.
2. Once you are authenticated, enter:
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $cred -Authentication Basic -AllowRedirection
3. Then, as shown below:
Import-PSSession $Session
4. The next step is to verify that IRM is not configured yet:
Get-IRMConfiguration
5. Now, configure the key-sharing location. For my North America environment, I'll use the following:
Set-IRMConfiguration -RMSOnlineKeySharingLocation https://sp-rms.na.aadrm.com/TenantManagement/ServicePartner.svc
Here's the list of key-sharing locations depending on where your tenant resides:
- North America: https://sp-rms.na.aadrm.com/TenantManagement/ServicePartner.svc
- European Union: https://sp-rms.eu.aadrm.com/TenantManagement/ServicePartner.svc
- Asia: https://sp-rms.ap.aadrm.com/TenantManagement/ServicePartner.svc
- South America: https://sp-rms.sa.aadrm.com/TenantManagement/ServicePartner.svc
- Office 365 for Government: https://sp-rms.govus.aadrm.com/TenantManagement/ServicePartner.svc
6. Import the TPD, i.e. Trusted Publishing Domain, from RMS Online:
Import-RMSTrustedPublishingDomain -RMSOnline -Name "RMS Online"
7. Now, test the successful setup of IRM in Exchange Online (enter your admin username as the sender):
Test-IRMConfiguration -Sender crmadmin@cft77.onmicrosoft.com
8. Enable internal licensing and test again:
Set-IRMConfiguration -InternalLicensingEnabled $true
And you'll get the passed result.
9. A few more steps. Disable IRM templates in OWA and Outlook:
Set-IRMConfiguration -ClientAccessServerEnabled $false
and enable IRM for O365 Message Encryption:
Set-IRMConfiguration -InternalLicensingEnabled $true
10. Now, check the IRM configuration:
Get-IRMConfiguration
IRM is now set up!
Configure Rules in Exchange Admin Center:
Now we will set up a very simple rule where Exchange sends out an encrypted email.
1. Navigate to the Exchange Admin Center in O365.
2. Under the Mail Flow section, create the rule below, and set the condition as: if the sender is CRM Admin, encrypt the email. Then save.
3. Try sending a sample email. The email will be received like this.
4. Download the HTML file and open it. The HTML file will present the following options. Let's say I select OTP; I'll then get another email like this. Once I enter that OTP, I can see the message.
And there you have the encrypted message feature, as shown above! Hope this was helpful!
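For completeness, the same rule can also be created from an Exchange Online PowerShell session instead of the Admin Center. A minimal sketch, assuming a connected session and the sender from the example above (-ApplyOME is the legacy Office 365 Message Encryption rule action):

# Encrypt every message sent by the CRM Admin account
New-TransportRule -Name "Encrypt mail from CRM Admin" -From "crmadmin@cft77.onmicrosoft.com" -ApplyOME $true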
Azure Function
Introduction:
Sometimes you want to write a small piece of code and make it accessible from outside, but you may not have the infrastructure ready for it. In that situation you can write a function which can be called from anywhere. Azure Functions provides the infrastructure required for the code you need to write, and you can make it available within a few minutes. Use Azure Functions where your code does not include any complex logic: very small pieces of code that can be invoked via triggered events. You can choose your language to write the code, like C#, F#, Node.js, Python or PHP. Azure Functions lets you develop serverless applications on Microsoft Azure.

Description:
1. Features: Azure Functions provides the following features:
- Choice of language – we can write functions using C#, F#, Node.js, Python, PHP, batch, bash, or any executable.
- Pay-per-use pricing model – pay only for the time spent running your code.
- Bring your own dependencies – we can include other libraries as well if needed.
- Integrated security – protect HTTP-triggered functions with OAuth providers such as Azure Active Directory, Facebook, Google, Twitter, and Microsoft Account.
- Simplified integration – easily leverage Azure services and software-as-a-service (SaaS) offerings.
- Flexible development – code your functions right in the portal, or set up continuous integration and deploy your code through GitHub or Visual Studio Team Services.

2. Function Tasks: Functions provides templates to get you started with key scenarios:
- BlobTrigger
- EventHubTrigger
- Generic webhook
- GitHub webhook
- HTTPTrigger
- QueueTrigger
- ServiceBusQueueTrigger
- ServiceBusTopicTrigger
- TimerTrigger

Create your first function:
Prerequisites: a Windows Azure subscription; you can also subscribe for a free trial of Windows Azure from the URL.

Add a Function App
In order to host your code, you must have a Function App created in Azure.
1. Log in to the Azure portal and click on the + (plus) sign.
2. Select Function App and provide the required details, as shown below.
3. Provide all the required field values and click Create; you will then see the screen below.

Create the function:
1. Click on the plus sign as shown.
2. Select the HttpTrigger-CSharp template.
3. Provide a unique name for the function and click Create.
Now you are ready with a function that can be accessed from anywhere. The template comes with automatically generated sample code; you can replace it with your own, but we will work with the sample here.

Let's test the function from outside. Copy the URL from the console as shown. This URL can be called from any API testing application; let's call it from Postman. As you can see in the screen below, I called the function from Postman, and the output window shows a validation message because I had not provided the required fields. Once I provided the name, the output shows the name with a greeting. You can also view the function logs.

Conclusion:
To conclude, we can have a small piece of code that can be called from outside, without having to maintain the infrastructure ourselves.
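If you'd rather test from the command line than Postman, the same call can be made with PowerShell. A minimal sketch; the function app name, function name, and key below are placeholders for the URL you copied from the portal:

# Hypothetical URL copied from the portal; the 'code' query parameter is the function key
$uri = "https://myfunctionapp.azurewebsites.net/api/HttpTriggerCSharp1?code=<function-key>"

# Without the expected field, the sample code replies with a validation message
Invoke-RestMethod -Method Post -Uri $uri -ContentType "application/json" -Body '{}'

# With a name, it replies with a greeting such as "Hello Azure"
Invoke-RestMethod -Method Post -Uri $uri -ContentType "application/json" -Body '{"name":"Azure"}'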
Using D365 App for Outlook for quick Lead capture
The purpose of this article is to help fellow entrepreneurs and sales managers leverage the power of Office 365 and Dynamics 365 to manage and build their sales pipeline with a few easy steps.

I use the Outlook Web Application (OWA) for my O365 email access. I also have CRM open in another tab and usually toggle between CRM and email. There are several email conversations with existing customers or partners that I would like to quickly track as a Lead in CRM. I might not know the timeline or even the budget, but since it is an existing customer/partner, I know this would be a good lead. The disadvantage of not tracking these potential opportunities is that after a while you tend to forget to follow up! In this article, we focus on leveraging the D365 App for Outlook to convert emails into Leads, which then feed into our sales pipeline.

Step 1 – Hit that "D" hard
D365 now has a cool new logo (not a logo, maybe an icon?). Anyway, once you install the D365 App for Outlook, you will see this logo next to any email you have received. Below is an email I received this morning from our partner about a potential opportunity.

Step 2 – Track the Email and Create a Lead!
Once you 'hit the D', you get to this window, where you can 'Track' that email. I already have Andy Neal as a contact in my system, so the app gets me all that info right in my email window! Once you track the email, you will get an option to set the regarding record. On this screen, select New and select Lead. Finally, enter the details for your Lead, and either close the window or open that Lead right from your email!

Step 3 – Just do it.
Yes, this step is the same as in my previous article. Get in the habit of doing this and you will see a good lead pipeline that you can work through daily to increase your conversion rates. Remember – 'Sales cures all.' Let's take care of that sales pipeline!

You can always email me at AShah@CloudFronts.com to discuss your sales processes and technology adoption. In the coming articles, I will continue to focus on efficient ways to build and manage your sales pipeline and how this ties into one of the most important KPIs for running your professional services business.
Configuring Azure AD B2C: Sign up and sign in for consumers in your applications on Azure
Azure Active Directory B2C is a cloud identity management solution for your consumer-facing web and mobile applications. It is a highly available global service that scales to hundreds of millions of consumer identities. Built on an enterprise-grade secure platform, Azure Active Directory B2C keeps your applications, your business, and your consumers protected. It offers developers a better way to integrate consumer identity management into their applications with the help of a secure, standards-based platform and a rich set of extensible policies. When you use Azure Active Directory B2C, your consumers can sign up for your applications by using their existing social accounts (Facebook, Google, Amazon, LinkedIn) or by creating new credentials with a username and password, called "local accounts."

Get started
To build an application that accepts consumer sign-up and sign-in, you'll first need to register the application with an Azure Active Directory B2C tenant.

Step 1: Sign in to your Azure subscription and get access to Azure AD B2C.

Step 2: Create an Azure AD B2C tenant
Use the following steps to create a new Azure AD B2C tenant. Currently, B2C features can't be turned on in your existing tenants.
1. Sign in to the Azure portal as the Administrator.
2. Click New > App Services > Active Directory > Directory > Custom Create.
3. Choose the Name, Domain Name and Country or Region for your tenant. If B2C directories are not yet available in the selected country/region, select a region or country where B2C is available.
4. Check the option that says This is a B2C directory.
5. Complete.
Your tenant is now created and will appear in the Active Directory extension. You are also made a Global Administrator of the tenant, and you can add other Global Administrators as required.

Step 3: Navigate to the B2C features blade in the Azure portal
1. Navigate to the Active Directory extension on the navigation bar on the left side.
2. Find your tenant under the Directory tab and click it.
3. Click the Configure tab.
4. Click the Manage B2C settings link in the B2C administration section. The Azure portal will open in a new browser tab or window, showing the B2C features blade.
Note: It can take up to 2-3 minutes for your tenant to be accessible in the Azure portal. Retrying these steps after some time will fix this.

Easy access to the B2C features blade in the Azure portal
Pin this blade to your Startboard for easy access.
1. Sign in to the Azure portal as the Global Administrator of your B2C tenant. If you are already signed in to a different tenant, switch tenants (on the top-right corner).
2. Click Browse on the left-hand navigation.
3. Click Azure AD B2C to access the B2C features blade.

How to add an application in Azure AD B2C
After adding the application, you need to share the Application ID with the development team for the coding that redirects users to the sign-up and sign-in page. Here, 'renaissancesvcb2c.onmicrosoft.com' is your tenant ID, and the URL 'https://www.contoso.com' will be required when configuring identity providers for sign-up and sign-in. After you configure your tenant ID and URL with an identity provider, it will give you a client ID and secret.

Add Identity Provider
Use that ID and key in Azure AD and try to sign up and sign in. After adding identity providers, the next step is to add sign-up policies as per your requirement. Adding sign-in policies is easier than sign-up policies.
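To give the development team a concrete picture: the redirect to the B2C sign-up/sign-in page is an OAuth 2.0 authorize request against the tenant, with the policy name passed in the p parameter. A hypothetical example, assuming a sign-up/sign-in policy named B2C_1_SignUpIn and the tenant and redirect URL from above (client_id is the Application ID of the registered application; line breaks added for readability):

https://login.microsoftonline.com/renaissancesvcb2c.onmicrosoft.com/oauth2/v2.0/authorize
    ?p=B2C_1_SignUpIn
    &client_id=<application-id>
    &redirect_uri=https%3A%2F%2Fwww.contoso.com
    &response_type=id_token
    &scope=openid
    &nonce=<random-value>

If the request succeeds, B2C renders the sign-up/sign-in experience defined by the policy and returns an id_token to the redirect URL.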
Introduction to Azure Event Hubs
Overview
Microsoft Azure Event Hubs is a managed platform service that can ingest large amounts of data for various scenarios. It is a highly scalable, low-latency data ingestion system. Data is ingested here in the form of events: Event Publishers submit data to the Event Hub, and Event Consumers consume the data at their own pace. Some scenarios where Event Hubs are applicable are application instrumentation, user experience, and the Internet of Things (IoT). Event Hubs reside in a Service Bus namespace, and Event Hubs uses AMQP and HTTP as its primary API interfaces. The diagram below gives a high-level overview of where Event Hubs sit.

Partitions in Event Hubs
Partitions are ordered sequences of events that reside in the Event Hub. Newer events are added to the end of the sequence as they arrive. Partitions retain data for a configured period of time; this setting is common across all partitions in the Event Hub. Every partition is populated at its own pace, not necessarily sequentially, so data in partitions grows independently. The number of partitions is specified when the Event Hub is created; this number should be between 2 and 32, with 4 partitions allotted by default. The number of partitions you choose relates to the number of concurrent consuming applications you expect to have. The partition count cannot be changed once the Event Hub is created.

Event Publishers
So who are event publishers? The entities that publish data to the Event Hub are the event publishers. They can publish data to the Event Hub using either HTTPS or AMQP 1.0, and they use a SAS (Shared Access Signature) token to authenticate themselves to the Event Hub.

Common Tasks for a Publisher
Acquire a SAS token – SAS is the authentication mechanism for Event Hubs. Service Bus provides SAS policies at the namespace level and at the Event Hub level, and can regenerate the key and authenticate the sender.
Publish an event – Service Bus provides an EventHubClient class for publishing events to an Event Hub from .NET clients. Events can be published either individually or batched. A single publication, whether batch or individual, has a limit of 256 KB; publishing events larger than this will result in an error. (A REST-based sketch of both tasks follows below.)
Partition key – a partition key is a sender-supplied value passed to the Event Hub that maps incoming messages to specific partitions for data organization purposes. It is processed through a hashing function which creates the partition assignment. Partition keys are important for organizing data for downstream processing. The diagram below explains how a partition key works.
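Here is a minimal sketch of both publisher tasks (acquiring a SAS token and publishing an event) over HTTPS from PowerShell. The namespace, hub, policy name, and key below are placeholders, and the token is built by hand purely for illustration; from .NET you would normally let the EventHubClient handle this.

# Placeholders: replace with your own namespace, Event Hub, and Send policy
$ns      = "mynamespace"
$hub     = "myeventhub"
$keyName = "SendPolicy"
$key     = "<shared-access-key>"

$uri    = "https://$ns.servicebus.windows.net/$hub/messages"
$epoch  = [DateTime]'1970-01-01'
$expiry = [int]([DateTime]::UtcNow - $epoch).TotalSeconds + 3600   # token valid for one hour

# The SAS token is an HMAC-SHA256 signature over "<url-encoded-uri>`n<expiry>" using the policy key
$encodedUri = [Uri]::EscapeDataString($uri)
$hmac = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Text.Encoding]::UTF8.GetBytes($key)
$sig = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes("$encodedUri`n$expiry")))
$token = "SharedAccessSignature sr=$encodedUri&sig=$([Uri]::EscapeDataString($sig))&se=$expiry&skn=$keyName"

# Publish a single event; the body must stay under the 256 KB publication limit
Invoke-RestMethod -Method Post -Uri $uri -Headers @{ Authorization = $token } -ContentType "application/atom+xml;type=entry;charset=utf-8" -Body '{"deviceId":42,"temperature":21.5}'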
Event Consumers
An entity that reads event data from the Event Hub is an event consumer. All event consumers read from the partitions in a consumer group, and each partition should have only one active reader at a time.

Consumer Groups
A consumer group is a view (state, position, or offset) of the entire Event Hub. Consumer groups let consuming applications each have a separate view of the Event Hub. There is always a default consumer group, and you can create up to 20 consumer groups in an Event Hub.

Stream Offsets & Checkpointing
An offset is the position of an event within a partition: a client-side marker that specifies the point from which processing should resume. Consumers should store their own offsets. Checkpointing is the process by which readers mark their position in a partition of the Event Hub.

Common Consumer Tasks
All consumers connect to the Event Hub via AMQP 1.0, a session- and state-aware bidirectional communication channel. As this is a partitioned consumer model, only one consumer can be active on a partition at a time within a consumer group. The following data is read from the Event Hub: offset, sequence number, body, user properties, and system properties. As mentioned above, it is the user's responsibility to maintain the offset.

So now you know about Event Hubs!

Summary
Azure Event Hubs provides a highly scalable, low-latency telemetry processing service that can be used for common applications. In the next part of this blog, I'll take a technical look at Event Hubs, where Dynamics CRM publishes data to an Event Hub and the data is made available for applications to consume. Watch out here for the upcoming blog very soon! Hope this overview was helpful.
Data Movement using Azure Data Factory
Prerequisites: Azure subscription, SQL Server Management Studio (SSMS), Azure Explorer

What is Azure Data Factory?
Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data. Data Factory works across on-premises and cloud data sources and SaaS to ingest, prepare, transform, analyze, and publish your data. You can use Data Factory anytime you need to collect data of different shapes and sizes, transform it, and publish it to extract deep insights, all on a reliable schedule.

Key Concepts in Azure Data Factory
- Dataset – identifies data structures within different data stores, including tables, files, folders, and documents
- Linked Service – defines the information needed for Data Factory to connect to external resources
- Pipeline – used to group activities into a unit that together performs a task
- Activity – defines the actions to perform on your data
Read more about Azure Data Factory here.

In the example below, we demonstrate a copy data activity from a CSV file stored in Azure Blob Storage to an Azure SQL Database, using the Azure Data Factory Editor.

Steps for Data Movement using Azure Data Factory:

Step 1: Create a storage account and a container in Azure. Place the file containing the data into the container using Azure Explorer or a similar tool.

Step 2: The image below shows the CSV file content and the same file placed in the Azure container using Azure Explorer.

Step 3: Create an Azure SQL Database to store the output data.

Step 4: By connecting SSMS to the Azure SQL Database, we can create the output table in the Azure SQL Database.

Step 5: Now go to the new Azure portal, i.e. portal.azure.com, and create a new Data Factory as shown.

Step 6: We need to create three things to start the data movement: linked services, datasets and a pipeline. You can start creating them by opening the Azure Data Factory and clicking on "Author and deploy".

Step 7: First create the linked service for the Azure SQL Database, and then for Azure Blob Storage. Find the JSON code for the linked services below.

For Azure SQL Database:

{
    "name": "AzureSqlLinkedService",
    "properties": {
        "description": "",
        "hubName": "adfcf_hub",
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Data Source=tcp:qbozi5org6.database.windows.net,1433;Initial Catalog=adfcfs;Integrated Security=False;User ID=cfadmin@qbozi5org6;Password=**********;Connect Timeout=30;Encrypt=True"
        }
    }
}

For Azure Blob Storage:

{
    "name": "StorageLinkedService",
    "properties": {
        "description": "",
        "hubName": "adfcf_hub",
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=adfcfsstorage;AccountKey=**********"
        }
    }
}

Step 8: Now create the datasets for the source as well as the sink.

For Azure SQL Database:

{
    "name": "OpportunitySQLTable",
    "properties": {
        "structure": [
            { "name": "OpportunityName", "type": "String" },
            { "name": "Status", "type": "String" },
            { "name": "EstimatedRevenue", "type": "String" },
            { "name": "ContactPerson", "type": "String" }
        ],
        "published": false,
        "type": "AzureSqlTable",
        "linkedServiceName": "AzureSqlLinkedService",
        "typeProperties": {
            "tableName": "Opportunity"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}

For Azure Blob Storage:

{
    "name": "OpportunityTableFromBlob",
    "properties": {
        "structure": [
            { "name": "OpportunityName", "type": "String" },
            { "name": "Status", "type": "String" },
            { "name": "EstimatedRevenue", "type": "String" },
            { "name": "ContactPerson", "type": "String" }
        ],
        "published": false,
        "type": "AzureBlob",
        "linkedServiceName": "StorageLinkedService",
        "typeProperties": {
            "fileName": "Opportunity.csv",
            "folderPath": "adfcontainer/",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ","
            }
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        },
        "external": true,
        "policy": {}
    }
}

Step 9: Create the pipeline. Find the JSON code below.

{
    "name": "ADFDataCopyPipeline",
    "properties": {
        "description": "Copy data from a blob to Azure SQL table",
        "activities": [
            {
                "type": "Copy",
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": {
                        "type": "SqlSink",
                        "writeBatchSize": 10000,
                        "writeBatchTimeout": "60.00:00:00"
                    }
                },
                "inputs": [ { "name": "OpportunityTableFromBlob" } ],
                "outputs": [ { "name": "OpportunitySQLTable" } ],
                "policy": {
                    "timeout": "01:00:00",
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst"
                },
                "scheduler": {
                    "frequency": "Hour",
                    "interval": 1
                },
                "name": "CopyFromBlobToSQL",
                "description": "Push Regional Effectiveness Campaign data to Azure SQL database"
            }
        ],
        "start": "2015-11-17T08:00:00Z",
        "end": "2015-11-17T09:00:00Z",
        "isPaused": false,
        "pipelineMode": "Scheduled"
    }
}

Step 10: Now go back to your Data Factory editor, where you can see the status of the different linked services, datasets and the pipeline you created.

Step 11: Click on "Diagram" and check the status of the slices scheduled for data movement.

Step 12: Once the slices are in Ready status, you can go back to the Azure SQL Database and check whether the data has been copied/moved.
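As an alternative to the "Author and deploy" editor, the same JSON definitions can be deployed with Azure PowerShell. A minimal sketch, assuming the AzureRM modules are installed, you are logged in, the resource group and factory names are illustrative, and the JSON above is saved to the local files named below:

# Create the data factory
New-AzureRmDataFactory -ResourceGroupName "ADFResourceGroup" -Name "ADFCF" -Location "West US"

# Deploy the linked services, datasets, and pipeline from the JSON files above
New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADFResourceGroup" -DataFactoryName "ADFCF" -File ".\AzureSqlLinkedService.json"
New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADFResourceGroup" -DataFactoryName "ADFCF" -File ".\StorageLinkedService.json"
New-AzureRmDataFactoryDataset -ResourceGroupName "ADFResourceGroup" -DataFactoryName "ADFCF" -File ".\OpportunitySQLTable.json"
New-AzureRmDataFactoryDataset -ResourceGroupName "ADFResourceGroup" -DataFactoryName "ADFCF" -File ".\OpportunityTableFromBlob.json"
New-AzureRmDataFactoryPipeline -ResourceGroupName "ADFResourceGroup" -DataFactoryName "ADFCF" -File ".\ADFDataCopyPipeline.json"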
Azure setup using Office 365
In this blog, we walk through how to set up Azure using Office 365.

Pre-requisite: an Office 365 administrator account.

Steps
1. Log in to the Office 365 portal: navigate to https://portal.office.com.
2. Click on the Admin button.
3. Click on Azure AD to set up Azure. This will link your Azure AD to the organization account.
Note: Don't use the admin account to set up Azure AD; instead, you can use a client account. Once Azure AD is set up, the account administrator cannot be changed.
4. Fill in the required details to set up a free Azure trial account.
Note: A credit card is required for Azure sign-up.
5. After the sign-up process is completed, navigate to https://manage.windowsazure.com to access Windows Azure.
Creation of ACS and SAS in Azure
ACS (Access Control Service) is an Azure service that provides an easy way to authenticate users accessing web applications and services, without having to add complex authentication logic to code. SAS (Shared Access Signature) keys, meanwhile, are used to access resources in an account through shared access policies, each of which includes both a primary and a secondary key.

Assumptions
Your Azure account should be added in PowerShell with the respective user's credentials.
Note: For adding an account in Microsoft Azure PowerShell, refer to the following link: https://www.cloudfronts.com/azure-console-login-logout-using-azure-powershell/

Steps in Microsoft Azure PowerShell for ACS

Step 1: Run the ACS command in PowerShell
An ACS key can be created using the following Azure PowerShell command:
New-AzureSBNamespace GravityDocument -Location "Southeast Asia" -CreateACSNamespace $true -NamespaceType Messaging
The command requires the Service Bus namespace name, location and messaging type.

Step 2: ACS information in the Azure portal
This ACS key information can be seen in your Microsoft Azure account under the corresponding Service Bus namespace provided in the command above. Once the namespace is created, the corresponding connection details are available at the bottom, under Connection Information.

Steps in Microsoft Azure for SAS
For the SAS key, we have created a queue inside the namespace.

Step 1: Creation of the queue
A queue can now be created inside the specified namespace. Follow the screenshots below and specify the required details, i.e. the queue name under the specified namespace.

Step 2: Key with permissions
Now that the queue is created, a SAS key can be generated with different permissions, like Manage, Listen & Send. Under the Configure option, under Shared Access Policies, specify the name and the permissions to be given for that particular queue. The SAS key for that queue can then be obtained from the queue's Connection Information.

Conclusion
Thus, we can create ACS and SAS requests as per our requirements using Microsoft Azure PowerShell and the Azure portal.
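If you'd rather not open the portal for the connection information, the same details can be pulled from PowerShell. A small sketch, using the namespace created above:

# List the authorization rules for the namespace, including their connection strings
Get-AzureSBAuthorizationRule -Namespace GravityDocument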
Azure Console Login & Logout using Azure PowerShell
A user can add an account on Azure to check or get its subscription details, as well as remove the account. Let's walk through the steps to add an account and get subscription details; these also let the user select a particular subscription as required.

Step 1: Run Microsoft Azure PowerShell as Administrator.

Step 2: Add your Microsoft Azure account
To log in to your account, run the command Add-AzureAccount. A popup then appears asking for the username.
Note: Log in with your Live ID.

Step 3: Enter credentials
Once Step 2 is done, you will be asked for the user's credentials. Enter the username and password and press Enter. Once the details are entered, all the subscriptions with their IDs and tenants related to that particular user will be listed, as shown in the figure below.

Step 4: Get subscription details
If you want to see the full details for the subscriptions present in that particular account, run the command Get-AzureSubscription. This command lists every subscription with its corresponding details, like Subscription Id, Subscription Name, Account, Storage Account etc., as shown in the figure below.

Step 5: Select a particular subscription
To select a particular subscription, run Select-AzureSubscription -SubscriptionName <name>, and then run Get-AzureSubscription -Current, which returns the details of the currently selected subscription.

Step 6: Remove your Microsoft Azure account
To log out, run the command Remove-AzureAccount; PowerShell then asks for the account ID and a confirmation.

Conclusion: A user can log in and log out successfully with the help of Microsoft Azure PowerShell, and can also set and get the subscription details for that particular account.
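Putting the whole session together, here is a minimal sketch of the commands above; the subscription name and account ID are illustrative:

Add-AzureAccount                                             # Steps 2-3: sign in via the popup
Get-AzureSubscription                                        # Step 4: list all subscriptions and their details
Select-AzureSubscription -SubscriptionName "Pay-As-You-Go"   # Step 5: pick a subscription by name
Get-AzureSubscription -Current                               # confirm the current selection
Remove-AzureAccount -Name "user@contoso.com"                 # Step 6: sign out (prompts for confirmation)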