
When considering a move to the newest version of SharePoint, there are three areas to weigh. We'll discuss whether to stay on-prem, move to the cloud, or use a hybrid scenario; dive into the high-level features of SharePoint 2019 On-Premises; and then move on to some key factors that will guide you to the appropriate next step. Let's get into it!

Choosing to Stay On-Prem or in the Cloud

Know that you're not alone when choosing between a new on-premises, hybrid, or cloud solution. And let's admit it: when it comes to user experience, people don't like change. With this in mind, recent updates to SharePoint On-Premises let you keep the original user experience in backward-compatibility mode, which is great for those concerned about having to adjust to a different view!


The next thing we see in the latest version of SharePoint is a change in administration. The biggest change here is the range of new tools the platform brings to the table, all of which help make admins' jobs easier.

If you’re considering the cloud, know that the security will continually improve as updates are released. Those using On-Premises have to make an effort to keep up with improvements.

SharePoint 2019 Deprecations

Looking at the chart above, the left column shows deprecated features, meaning no more effort will be put into them. The right-hand column shows features that are 100% removed. Both are very important to note when considering what your organization prioritizes!

What’s New with SharePoint 2019 On-Prem?
Cloud Born Release

The first thing coming in SharePoint 2019 that looks almost exactly like Office 365 is Site Creation. The new release has a modern Site Creation interface that lets users use the Waffle to navigate to the SharePoint application and create a site themselves through the GUI. This should significantly lessen the load on admins and circumvent the need to click through an overwhelming number of settings links.

Modern Team Sites

There will also be updates to the modern Team Sites. "Modern" is a term to differentiate between the old code and the new code. A lot of pieces remain in place, but there's also a huge reduction in the amount of code in general. You'll also notice that this is the same infrastructure as Office 365. The beauty of it is that the re-formatting is actually usable and easy for people, as opposed to older versions (2013 sites, for instance).

Modern Libraries

Modern Libraries are where data and information are stored and organized in lists and libraries. From the library side of things, there are surfacing capabilities that make it significantly easier for users to get hard-to-find information. You've also got the nice modern slider panes that are now built into the libraries.

The other thing I really like about Modern Libraries is the ability to change your view to a tile view or a preview of the documents and images that exist in those libraries. It's so much easier than it used to be in the old SharePoint UI; now it's a drop-down option you can quickly access. Along with the updates to the Modern Libraries, there are now a lot more modern capabilities to go along with them too! For example, there are now one-click moves for documents and information.

Modern Pages

Speaking of information, Modern Pages have made their way to SharePoint 2019. Modern Pages essentially use all of these preset zones and make it easier to add text to different web parts and incorporate your own information through those SharePoint framework web parts. It’s become so much easier now!

OneDrive Sync Client

One of my favorite updates is the OneDrive Sync Client. A long time ago we all questioned whether to use the OneDrive Sync Client or the SkyDrive Sync Client; now it's called the "Next Generation Sync Client," and it's used across On-Premises, the cloud, OneDrive, and SharePoint. It can get a little confusing: it works with all SharePoint document libraries, so your data can live On-Premises while you still use the same OneDrive Sync Client that's already loaded with your OneDrive.

Modern

The new Modern experience is really catching SharePoint up to industry standards for what a web interface should look like. It's great that they're bringing this to the On-Premises platform with all these features that we've been enjoying in Office 365 for a while now.

Communication Sites

Next up there are updates to Communication Sites. Communication Sites are where all the other features come together. They're great for creating a site where people congregate and gather information. A great example would be an HR site.

Mobile

The out-of-the-box phone app for SharePoint works a whole lot better than anything that previously existed, especially with Modern Sites and the new mobile app redesign. It’s now significantly easier to use and see what’s being surfaced.

Hybrid Support

When it came to SharePoint 2013 and 2016, there was a lot of capability to connect and build out a hybrid environment. SharePoint 2019 pushes that capability even further. As opposed to having some features that allow you to build out a hybrid model, 2019 almost encourages you to build out that hybrid model.

Part of this is likely due to Microsoft’s push towards cloud adoption. If you want to have and maintain a hybrid environment in the long term, for whatever reason, it’s a lot easier to do with 2019.

Decisions, Decisions

When it comes to migrating from an older version of SharePoint to a newer version, there are many factors that need to be taken into consideration. For example, you may need to answer questions such as:

  • What’s the size and health of the infrastructure?
  • What’s the look and feel from a user experience perspective?
  • What are the associated compliance hurdles and risks?
  • What do authentication and authorization look like in the environment?
  • How strong is the network?

There’s so much more to consider when thinking about migrating. For a comprehensive evaluation of what a migration project might look like for your organization, check out AvePoint’s hybrid and cloud migration services.

Reference:

AvePoint, (2019). SharePoint 2019 Feature Breakdown: An Essential Overview for New Adopters. Available at:
https://www.avepoint.com/blog/sharepoint-hybrid/sharepoint-2019-overview/ [Accessed: 10th July 2019].

The post SharePoint 2019 Feature Breakdown: An Essential Overview for New Adopters appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


Before you can do anything with SharePoint 2019, you have to install SQL Server. You have a couple of options, as both SQL Server 2016 and 2017 can be used. I want to go with SQL Server 2017 (all the latest features, right?). So in this post I will be demonstrating how to install and configure SQL Server 2017 for SharePoint 2019. If you are wondering why you should bother with SharePoint 2019 at all, check out my previous post for a review of the features you gain and those you lose. This series doesn't cover the AD and server configuration; there are a couple of posts I did when building out SharePoint 2013 that can assist with this.

If you are interested in checking out the other items in this series click on the links below:

Install and Configure SQL Server 2017 for SharePoint 2019

In your VM, mount the SQL Server 2017 image. While I don't really need a config or unattend file for this configuration, I am going to go through the steps anyway in case I need to install SQL Server again somewhere else. First I am going to copy the necessary binaries to the server.

Copy SQL Binaries to Server (PowerShell):

# Assumes 'E' is your DVD drive. Change as required.
Copy-Item E:\ C:\Install\SQLBinaries\ -Recurse
Build the Config File

The first step is to build the config file. This way you can re-use it elsewhere with PowerShell and the binaries. Just a quick note: PowerShell could be used entirely here, but I wanted to create a more general method since everyone has different configurations. By creating your own config file you can tailor it to your requirements.

  1. Execute the setup file from the folder where you saved the binaries.
  2. Click on the Installation tab (left frame) and click on New SQL Server stand-alone installation or add features to an existing installation.
  3. Enter the product key for the SQL Server. Click Next.
  4. Accept the license. Click Next.
  5. Place a checkmark to allow Microsoft Update to keep the installation up to date. Click Next.
  6. At the Feature Selection screen, select the Database Engine Services option. You don't really need anything else for a basic SharePoint 2019 installation. Change the installation location as needed or leave the defaults. Click Next.
  7. At the Instance Configuration screen, enter a name for your instance. You can leave it as the default, but if you ever add more instances you are going to want meaningful names. Click Next.
  8. At the Server Configuration screen, enter the accounts and passwords you created to run the SQL Agent and Database Engine. Click Next.
  9. I always ensure the SQL and SP admin accounts are added as SQL Administrators. I also use mixed mode so I can utilize both domain and SQL accounts for connection and admin. Also, unless you actually have multiple drives within your VM, I don't usually modify the default directories. Click Next.
  10. Review the installation settings at the "Ready to Install" screen.
  11. Most importantly, the "Ready to Install" screen contains the location of the config file we want. If you want to install from the command line, I suggest you move it into your SQL Binaries folder.

From here you can select Install and let the installation run. I am actually going to finish the installation with PowerShell to show you how to do so.

Install SQL Server 2017

In the script we will run, we make use of the "/Q" option for a silent install. This interferes with the UIMode setting in the config file. Open up ConfigurationFile.ini, look for the entry UIMODE="Normal", and place a semicolon (;) in front of it to comment it out.
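After the edit, the relevant entry in ConfigurationFile.ini should look like this (a leading semicolon marks a comment in this file format; surrounding entries are omitted):

```ini
; UIMODE="Normal"
```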

To install SQL Server with our required options, run the following commands:

Install SQL Server 2017 from Command Line (PowerShell):

cd C:\Install\SQLBinaries
.\Setup.exe /ConfigurationFile=.\ConfigurationFile.INI /Q /Action=Install /IAcceptSQLServerLicenseTerms /SQLSVCPASSWORD=<password> /AGTSVCPASSWORD=<password> /sapwd=<password>

Once the installation is complete you are ready to move on to the next step… installing SharePoint 2019!

Thanks for reading!

About the Author:

David is a Senior Manager with Protiviti Canada, and a three-time Office Apps and Services MVP. Over the last 18 years, David has worked in a wide variety of areas in IT, ranging from Desktop\Server support to developing in C++ and .NET technologies. During that period, David has been able to add value to a wide variety of clients in several sectors, including utilities, government, banking, and agriculture. For the past eight years, David has been working exclusively in the SharePoint and Office 365 domain. He has helped stand up multiple Enterprise SharePoint environments as well as designed and built numerous solutions utilizing PowerApps, Microsoft Flow and SharePoint. He is a knowledgeable and sought-after international speaker within IT circles and is consulted regularly by companies formulating their Digital Workplace strategies.

Reference

Drever, D. (2019). Deploying a SharePoint 2019 Development Environment – Install and Configure SQL Server 2017 for SharePoint 2019. Available at:
http://prairiedeveloper.com/2019/03/install-and-configure-sql-server-2017-for-sharepoint-2019/

The post Deploying a SharePoint 2019 Development Environment – Install and Configure SQL Server 2017 for SharePoint 2019 appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


“My name is Andy and I am a huge fan of source control.” That said, I confess I have struggled with git for a couple years now. I recently had some “play time” – I love play time – and decided to play with git.

In this post I will demonstrate how to create an Azure DevOps account, configure it, create a new Azure DevOps Project, connect my Azure DevOps project to SQL Server Data Tools (SSDT), clone the Azure DevOps project to a local repository, create an SSIS project in the local repository, perform an initial checkin, update the local SSIS project, and check in the changes so they are stored in our Azure DevOps remote repository.

That’s a lot and I know you’re itching to get started walking through this demo. But first I’d like to share why I’m writing about Azure DevOps, Git, and SSIS – and recommend a book while I’m at it.

Why?

I have been listening to Can’t Hurt Me, an audio book by David Goggins and Adam Skolnick based on Goggins’ life. The audio book (and hard copy) contains profanity. I am not a fan of gratuitous profanity – profanity for the sake of profanity. In my opinion, that’s not what I’m hearing when I listen to Can’t Hurt Me.

I’m hearing a man communicate.

Goggins offers solid advice for, well… life. The audio book is better than the hard copy because the co-authors chat in between (and during) chapters about the experiences recorded in the book. (Shameless plug: You can get a free audio book from Audible and help out the Data Driven Podcast if you sign up using this link.)

One piece of advice Goggins offers is: face your challenges and fears. Overcome them. With a tip o’ the cap to Mr. Goggins, let’s “get after this.”

Git is a distributed version control system (VCS) that is free and open source. I can hear some of you thinking, “What does that have to do with SQL Server Integration Services (SSIS), Andy?” I’m glad you asked:

SSIS is software development. – Andy, circa 2006

Learning By Example

There are a few moving parts to using SSIS with Azure DevOps. In this post I’m going to share how to configure the parts and then get them moving. Cool?

Set Up an Azure DevOps Account

First, you need Azure DevOps. Browse to dev.azure.com and set up an account:

Azure DevOps

Before I dive into something like this, I prefer to count the cost. At the time of this writing Azure DevOps may be used for free:

Select a plan

See the Azure DevOps Pricing page for details (note: this may have changed since I wrote this post).

Once you have Azure DevOps up and running, create a project:

EntDNA

Clicking the Create project button opens the Create New Project window. Configure the project by selecting a version control engine and a work item process:

Create new project

Please note TFS is an option for version control engine. Are you using TFS on-premises but thinking of migrating to the cloud? Here you go.

Click the Create button to proceed. In a minute or so, the Overview>Summary page for the project displays:

SSIS Test
Connect to the Azure DevOps Project

The next step is to connect SSDT to the Azure DevOps project. Begin by clicking Team>Manage Connections:

Microsoft Visual Studio

Note: You may need to disconnect a current connection first:

Disconnect from Server

You may need to add an account to Visual Studio’s connections:

Connect to a Project

If so, you are prompted for credentials from Azure:

Visual Studio

Once connected to Azure DevOps via Azure security, you may select an account:

Connect to a Project

After connecting, you may select the project you created earlier:

Connect to a Project

Git works in a local repository which exists on the developer’s machine. The Clone button surfaces three options:

  1. Connect
  2. Clone
  3. Clone with Submodules

The default option for the highlighted button is "Clone." That's a clue: cloning is our next step. But for our introductory demonstration, we select Connect:

Clone Options

Note we cannot continue unless we clone the repository (that’s why it was shown by default!):

Clone the repository
WHAT JUST HAPPENED?

When we cloned the SSIS Test Azure DevOps git repository, we created a local, linked copy of the SSIS Test Azure DevOps git repository in the specified folder – in my case, the local folder was C:\Users\A. Ray Leonard\source\repos\SSIS Test:
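Under the hood, that Team Explorer operation is an ordinary `git clone`. Here's a minimal, self-contained sketch of the relationship, using a local bare repository to stand in for the Azure DevOps remote (all names are illustrative):

```shell
# A bare repo plays the role of the Azure DevOps remote.
git init --bare remote-repo.git

# Cloning creates the local, linked working copy.
git clone remote-repo.git local-copy

# The clone remembers where it came from as its "origin" remote.
git -C local-copy remote -v
```

The last command lists the origin URL, which is exactly the "link" between your local folder and the remote repository.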


.git

The next step is to “Create a new project or solution” in this new local repository (it still has that new repository smell!):

Team Explorer
Create an SSIS Project

Click the “Create a new project or solution” link in Team Explorer to create a project in our new local repository:

New Project

View Solution Explorer once the new project has been created. Note the small green “+” sign decorating the solution name:

Solution Explorer

The “+” sign indicates the solution has been added to source control. Again, we may view the solution in our local repository using Windows Explorer:

.git

But when we check our Azure DevOps SSIS Test project, we see a message indicating the remote repository is empty:

“Dude! Where’s my code?”
Committing Code (Locally)

Click View>Team Explorer to open the Team Explorer dialog:

Team Explorer

Once Team Explorer opens, click the Changes button:

“Ch-ch-ch-changes”

Enter a Commit Message:

Q: “How committed are you to this relationship?”

Click the Commit All button to commit the changes locally:

Team Explorer – Changes

What does locally committed mean? Your local repository has been updated but no changes have been transmitted to your remote repository…yet.
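From the command line, the equivalent of Commit All is `git add` plus `git commit`. A throwaway example (the file name and message are illustrative) showing that the commit lands only in local history:

```shell
# Create a scratch repo and identify ourselves to git.
git init demo && cd demo
git config user.email "you@example.com"
git config user.name "You"

echo "placeholder" > Package1.dtsx   # stand-in for the SSIS package file

git add .                            # stage the changes
git commit -m "Add SSIS project"     # commit -- recorded locally only

git log --oneline                    # the commit is in local history;
                                     # no remote repository has seen it yet
```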

As the message above reads, “Sync to share your changes with the server.” Click the “Sync” link (hey, that rhymes!):

Syncing

Clicking the Sync link triggers a git “push”:

Team Explorer – Synchronization


Team Explorer lets you know once the sync and push operations complete:


Team Explorer – Synchronization

Now you can see files in your remote repository – up in Azure DevOps, finally!

There be files here. WOO HOO!

You can learn a lot more about interacting with your local and remote repository from the Pro Git book, available online for free:

git
Update, Commit, Sync (Push)

We now have a copy of our mostly-empty SSIS project stored in the cloud. w00t! Let’s make some changes to the project and push them to Azure DevOps.

I opt to rename the package:

AzureDevOpsTest

I opt to add a Script Task for ETL Instrumentation as recently posted, SSIS Design Pattern: Use Script Tasks for ETL Instrumentation:

SCR Log Package Start
I Confess, This (Still) Confuses Me…

After a save, I add a message and commit the changes locally:

This is not the confusing part…

Once the changes are committed locally, I click the Sync button as before:

Still not the confusing part…

After clicking Sync, I need to Push the changes to the Azure DevOps remote repository:

THIS is the confusing part.

This confuses me. Why doesn't Sync issue a git push, like it did with the initial sync?
I'd even be OK with the initial sync not pushing; I crave consistency.
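For reference, Team Explorer's Sync corresponds roughly to a `git pull` followed by a `git push`; running the commands explicitly removes the ambiguity. A self-contained sketch, with a local bare repo standing in for the Azure DevOps remote (all names are illustrative):

```shell
# Stand-in remote plus a working clone.
git init --bare origin.git
git clone origin.git work && cd work
git config user.email "you@example.com"
git config user.name "You"

echo "v1" > Package1.dtsx
git add . && git commit -m "Initial checkin"
git push origin HEAD                  # first push seeds the remote

echo "v2" > Package1.dtsx
git add . && git commit -m "Rename and instrument package"

# "Sync" is roughly this pair: integrate remote changes, then publish ours.
branch=$(git symbolic-ref --short HEAD)
git pull origin "$branch"
git push origin "$branch"
```

Done this way, the pull and the push are separate, visible steps, so there is no wondering whether "Sync" pushed or not.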

Regardless, my updates now reside in the Azure DevOps remote repository:

Azure DevOps
Conclusion

In this post we configured Azure DevOps, added an Azure DevOps Project, connected the Azure DevOps project to SQL Server Data Tools (SSDT), cloned the Azure DevOps project locally, added an SSIS project to the local repository, performed an initial checkin, updated the local SSIS project, and checked in the changes.

Mission accomplished.

:{>

About the Author:

Andy Leonard is the founder and Chief Data Engineer at Enterprise Data & Analytics, an SSIS trainer, consultant, and developer; a Biml developer and BimlHero; SQL Server database and data warehouse developer, community mentor, engineer, and farmer.

Reference:

Leonard, A. (2019). Azure DevOps, SSIS, and Git Part 0 – Getting Started. Available at:
https://andyleonard.blog/2019/03/azure-devops-ssis-and-git-part-0-getting-started/ [Accessed: 5th June 2019]

The post Azure DevOps, SSIS, and Git Part 0 – Getting Started appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


Creating a company-wide Office 365 email signature isn't always an easy process, and it comes with limitations. To create a full HTML email signature using the in-built Office 365 disclaimer feature, you'll need a decent knowledge of Transport Rules and HTML coding.

The in-built Office 365 disclaimer editor within the admin portal is not designed to deploy full-HTML email signatures; it is designed for plain text disclaimers, which means there are complications and limitations when trying to deploy branded email signatures for your users.

What are the limitations of manually setting up Office 365 signatures?

Creating email signatures for all Office 365 users can be done within the mail flow settings of the Office 365 admin portal. You have to create a rule to append a disclaimer to the sender’s email; if you’d like to deploy a full HTML signature you will have to paste in HTML code when specifying the disclaimer text.

If all you wanted was a plain text disclaimer on the bottom of an email thread, creating mail rules within the admin portal is all you will need. However, for full HTML Office 365 signatures on every email you send, your options are quite limited:

  • Users won't get email signatures on reply emails – only one signature will show at the bottom of an email conversation.
  • You cannot test email signatures before deploying to all users.
  • You cannot embed images such as logos or promotional banners directly into the signature.
  • Email signatures will not be applied to emails sent from mobile devices and Macs.
  • You cannot schedule email signatures based on time or date.
  • You'll need to know HTML or know someone who does.
  • Users cannot see their email signature in their Sent Items folder.

In order to append HTML signatures on all email sends, including replies, you will either need to write PowerShell scripts, which can prove to be extremely time consuming, or you can use a third-party solution such as Exclaimer Cloud – Signatures for Office 365.

How can Exclaimer help?

This is where a third-party Office 365 signature management solution can help. Exclaimer Cloud – Signatures for Office 365 is the premier cloud service for centrally managing Office 365 email signatures. The solution allows you to create and manage multiple email signatures for every Office 365 user in your organization, without the hassle of creating rules or commands.

Exclaimer Cloud adds signatures to all sent emails via Microsoft Azure. That means signatures are added to emails sent from any device, including smartphones and tablets. The service also allows for easy management of specific email signature elements, including social media icons, promotional banners and legal disclaimers, from one intuitive web portal.

What’s more, you can even delegate the responsibility of managing email signatures to someone else, such as the marketing department, and whoever does create signatures doesn’t have to be skilled in HTML.

The solution has many advantages over manually implementing Office 365 email signatures, including:

  • Email signatures added to emails sent from any mail client or device
  • Adds the signature under the most recent reply
  • Apply a specific signature based on the recipient
  • See signature while composing an email in Outlook
  • And much more!

Find out more by visiting https://www.exclaimer.com/exclaimer-cloud/signatures-for-office-365

The post The Difficulty of Office 365 Email Signature Management appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


Software development has continuously become more complicated, even though improved tools have made individual tasks easier. The world has already shifted towards Microservices-based architectures. There will no longer be large-scale web applications that cram a lot of functionality in.

Microservices are small in scope and independent. They can be built and tested separately, as each is independent both in its functionality and from the other Microservices that provide supporting functionality around it. A Microservice is built to solve a particular problem or deliver a particular piece of functionality, and Microservices are intended to be deployed rapidly and frequently based on need. Given their smaller size, this can come at the cost of ending up with multiple projects or components to develop.

What is Azure Dev Spaces?

Imagine a developer having to connect 10-15 Microservices together to get some functionality done. There are risks of environment-based issues as well. The best way to counter these within a team is to host the services in the cloud, and that is exactly what Azure Dev Spaces does. If you are part of a team that works on a multi-Microservice solution, you can host all your services in Azure Dev Spaces. Then you can live debug on Azure in a space created for yourself.

Azure Dev Spaces supports development with a minimal development machine setup. Developers can live debug on Azure Kubernetes Service (AKS) with development tools like Visual Studio, Visual Studio Code or the command line.

Is Dev Spaces for Testing?

Azure Dev Spaces is meant for development. Developer testing can be done, and it reduces the chance of finding many issues during integration testing after the solution is deployed for QA. But Dev Spaces is not the ideal solution for end-to-end regression testing; VSTS offers plenty of support for that.

How does it work? The connection journey

Let's look at the figure above. The Front End connects to the Customer API, and the Customer API connects to Service 1. When a user browses to the URL given above, the request routes through the stable versions of the services deployed in Azure Dev Spaces.

Then consider a situation where Malin needs to customize the 'Customer API' service. Azure Dev Spaces offers the ability to create multiple 'spaces'. Developers can create spaces specific to themselves or based on sprints, and then host the services needed for their customizations there. By default, a space named 'default' is created.

Customer API Malin’s Version

As shown in the image above, Malin's space can be accessed by adding the prefix 'malin.s' to the URL. When a request is made through that URL, Azure Dev Spaces checks for services hosted in the 'malin.s' space. As only the Customer API is hosted in that space, the request falls back to the default space for the other services.

You can have many developers create different spaces for themselves, each with its own URL. Once development is done, they can test and check in the code to source control, and host the customized version of the service in the default space. The other developers then get access to the customized service for development and testing.

This improves the developer experience and eases development effort. As of now, Azure Dev Spaces is provided as a service free for 12 months.

About the Author:

Malin De Silva is a Top Rated freelance Azure and SharePoint solutions consultant on Upwork with a 100% Job Success rate. He was also an MVP for Office Servers and Services during 2016-2018. He is an occasional blogger and a frequent speaker at local and regional user groups and forums.

The post Why Azure Dev Spaces are so promising? appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


Integrate SAP with PowerApps and Flow

Check out our new SAP integration Scenario with Microsoft PowerApps and Flow.

SAP integration with Microsoft PowerApps & Flow - YouTube

In this business scenario, material master data can be maintained and created from a PowerApp. All changes are posted to the SAP system via Microsoft Flow. The connectivity with SAP is realized with the cloud connector ECS Core.

Why process integration?
Handling transactional SAP processes from platforms or employee portals other than SAP can bring a lot of advantages. Most users don't need or want to spend time in the SAP GUI; they want an easy, intuitive and user-friendly interface. Training users for SAP-centric processes is costly and time-consuming. Platforms such as SharePoint or Office 365 are much better suited to handling such processes. Apps running there are embedded in a collaborative ecosystem employees are familiar with. Furthermore, users should only be involved where it matters; the rest is all automated. That's the basic idea. If implemented properly, development and training costs, as well as maintenance and support tasks, are much lower there than in SAP. In the end you save time, nerves and money.

Why integrate with PowerApps?
SAP is not the best platform for business process modeling (BPM). Even simple, predefined workflows such as leave or travel requests can only be adapted through complicated coding. That's why BPM and workflow solutions such as PowerApps, Flow and Nintex are an interesting alternative here. These tools were invented specifically for this purpose and are very powerful and flexible when it comes to building a process. A huge number of connectors are available in Microsoft Flow, for example, which allows countless use cases such as the example shown in the video and many more. All this is possible with low or no code and, once created, the apps can easily be rolled out to as many users as required.

Advantages of SAP integration with Theobald Software
In general, all the SAP connectors offer really straightforward integration; the tools are easy to install and configure. Furthermore, no installations or add-ons are necessary in SAP. For process integration, pre-built scenarios for common use cases and different SAP modules (Sales, Procurement, HR, …) are provided by Theobald Software, which makes it easy for customers to get started. The business content covers processes such as creating and changing sales orders, purchase orders, goods receipts, material master data and many more.

Interested in this scenario?
If you are interested in this use case or in SAP process integration in general, please don't hesitate to contact us at info@theobald-software.com.
You can also find the video on the Theobald Software YouTube channel.

About the Author:

Christian is the first point of contact for our customers on any matters having to do with SAP / SharePoint integration. Apart from that he lends assistance to technical support. He has a qualification in administrative studies. Before joining the Theobald team in 2015, he had spent some years as an SAP applications consultant. A native of Esslingen, he favours outdoor pursuits when it comes to free time – on trails, at marathons, cycle racing or mountain climbing.

Reference

Tauchmann, C. (2019). Integrate SAP processes with PowerApps and Microsoft Flow. Available at:
https://blog.theobald-software.com/2019/07/04/integrate-sap-processes-with-powerapps-and-microsoft-flow/ [Accessed 11th July 2019].

The post Integrate SAP processes with PowerApps and Microsoft Flow appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


Just 17% of employees in the UK tech sector are women, and I am one of them. Your first question is likely to be: what does it feel like to work in such a male-dominated sector? My honest answer is that I don’t always notice I am a woman. I tend to think it’s just coincidental that I’m female; what’s more important is that I am capable and confident.

My journey in tech started in the ’90s when I switched my University degree course from one including Physics and Chemistry to Computer Science and Maths. It was partly because I didn’t totally ‘get’ degree-level Physics and I was allergic to everything in the Chemistry labs. Quite funny when I look back at that time. I’ve since worked in several roles in IT ranging from Network Account Executive, Software Tester and Service Desk 2nd/3rd line to Technical Consultant, and a few in between. All these varied roles have given me a 360-degree view of the technology field and I routinely draw on past experiences when faced with customer challenges.

My passion for technology grew the more I worked in tech roles. I’ve always seen technology as a game changer and something that could be used for good. To this end I’ve started a charitable initiative called Pocket Angel, which is an app to help the general public gift homeless people essential items such as food and accommodation. It is set to launch in autumn 2019 in Brighton, UK and I’m very excited to see it grow and be used in other parts of the country and the world.


Alcia and members of her work team with a group of marketing students from the University of Brighton, the Bright Young Things, who completed marketing work for the Pocket Angel partner event to gain real-world experience.

As a woman in tech I am aware that not all jobs for women provide the benefits I enjoy by virtue of working in a tech-savvy company. My employer has embraced the advantages of working from home, provided me with access to online training for continuous learning, and allowed me to use state-of-the-art technology that helps me be more efficient and productive and achieve work–life balance. I want other women to benefit from working in a tech company. To help promote tech roles to those outside the industry, I volunteer at work as a board member for the Tech Women program across Europe. The program encourages women to consider a technical career path and gives them the chance to talk to women already working in those roles.

I would recommend that more women make that first step to get to know more about the tech sector and explore the various options for getting into tech roles. I did a lot of self-study courses in the evenings when my children were young to refresh my skills and upskill in preparation for the time when they would be in full-time school. I sat the exams and invested in my future. In the end it paid off, and I am now a Technical Consultant at a large tech company. So, as you have probably realised by now, I am in my dream job, and I hope that if this starts you thinking about making a change to a tech job, you too will land your dream job in tech.

Alcia presenting

Sources

https://www.recruitment-international.co.uk/blog/2017/08/only-17-percent-of-employees-in-uk-tech-sector-are-women-research-reveals

The post Finding Your Dream Job in Tech, A Story of How I Found Mine…. appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


As more organizations start utilizing Microsoft Teams, there are increasing concerns around how to properly manage all the information that users are generating. In our recent webinar, we provided the basic steps of how to get information management up and running simply and quickly. The key is breaking the problem down into bite-sized, actionable chunks!  

1. Map and Understand Your Information 

The first step is to map and understand your information at a high level. Understanding the different types of collaborative areas you have in your organization should help you drill down into the specific information contained within each. The chart below provides a visual on what that may look like:  

Map your information

2. Implement a Classification Schema

The next step is defining the schema you want to use to describe your information. This can be in the form of a business classification scheme, a taxonomy, or a file plan. This schema will ultimately supply the terms that you use to tag your information. Classifying your information allows you to sort and organize it with labels that make surfacing and protecting pertinent data simpler.
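A classification schema like this can be sketched as a simple two-level taxonomy of business function and activity. The function and activity names below are hypothetical examples for illustration, not terms from the webinar:

```python
# Hypothetical business classification scheme: top-level function -> activities.
TAXONOMY = {
    "Finance": ["Invoices", "Budgets", "Audits"],
    "Human Resources": ["Recruitment", "Payroll", "Training"],
    "Projects": ["Proposals", "Deliverables", "Meeting Minutes"],
}

def terms_for(function: str) -> list[str]:
    """List the tagging terms available under a top-level function."""
    return [f"{function}/{activity}" for activity in TAXONOMY[function]]

print(terms_for("Finance"))
# ['Finance/Invoices', 'Finance/Budgets', 'Finance/Audits']
```

However the schema is modeled, the point is that every item of content ends up tagged with a term drawn from a controlled, organization-wide vocabulary.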

Expanding tags and labels
3. Assign Actions to Terms and Deploy Across Office 365 

Once you’ve created clearly defined terms, you must then associate them with outcomes and actions. These can be a single action (e.g. destroy after 7 years) or it could be a more complex lifecycle (e.g. move to a new location, declare an item as a record, and then destroy it). Essentially, you want to map your information’s journey to make it easy to track.  

Assign actions

This includes being able to push out the terms and their associated actions to the locations where the relevant information will be saved. This is where you link your records management processes to your information architecture, thus ensuring the content will be classified on capture and therefore managed immediately.  
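The term-to-action mapping described above amounts to a file plan: each classification term carries a retention period and a final action. A minimal sketch, with purely illustrative terms, periods and actions:

```python
from datetime import date

# Hypothetical file plan linking classification terms to retention actions.
# Terms, retention periods and final actions are illustrative only.
FILE_PLAN = {
    "Contract":  {"retain_years": 7,  "final_action": "destroy"},
    "HR Record": {"retain_years": 75, "final_action": "review"},
    "Project":   {"retain_years": 2,  "final_action": "archive"},
}

def disposition(term: str, captured: date) -> tuple[date, str]:
    """Return when the retention action is due and what that action is."""
    rule = FILE_PLAN[term]
    due = captured.replace(year=captured.year + rule["retain_years"])
    return due, rule["final_action"]

# A contract captured on 1 Jan 2019 is due for destruction after 7 years.
print(disposition("Contract", date(2019, 1, 1)))
# (datetime.date(2026, 1, 1), 'destroy')
```

Because the action is derived from the term assigned at capture, content is managed from the moment it is saved rather than retro-classified later.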

4. Streamline the User Experience 

After you have your information mapped, the next step is to automate the process. This will naturally make it easier for end users to do the right thing. End users also don’t want to be records managers, so try to set defaults and allow the system to remove the burden of those traditional records tasks wherever you can.  

Streamline the User Experience
5. Maintain Compliance and Integrity 

Now that you have your information mapped and automated, you still need to monitor it! Use reporting and auditing to maintain oversight of your system (and make tweaks and adjustments where required). This is where you can ensure record integrity and authenticity.  

If you want a more in-depth look at the process and to ensure your Microsoft Teams information is being properly managed, register for the on-demand version of our webinar! 

Reference:

AvePoint, (2019). 5 Best Practices for Microsoft Teams and Information Management. Available at:
https://www.avepoint.com/blog/microsoft-teams/microsoft-teams-management/ [Accessed: 10th July 2019].

The post 5 Best Practices for Microsoft Teams and Information Management appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


If your business is relying on manual processes to classify vast volumes of unstructured data, then you likely aren’t making a dent. Enter: an AI-based data classification solution, which has numerous benefits for your enterprise.

It’s a statistic you’ve likely heard before: about 80 percent of enterprise data is unstructured, meaning it exists in formats that cannot readily be utilized for making key business decisions, meeting compliance requirements, and other important business needs. To achieve these critical objectives, data classification has become a business imperative. But have you ever stopped to consider just how large a task manually classifying all that unstructured data might be?

The Data Explosion

It’s estimated that we are generating 2.5 quintillion bytes of data each day – and that number continues to grow. By next year, it’s expected the rate of data generation will reach more than 146 GB per day per person on Earth.* For individual organizations, the level of data generation is staggering. From banking and insurance sectors to energy and life sciences, the data generated from millions of customers and research initiatives is resulting in similar explosions.

This surge means the task of data classification vastly exceeds human capabilities. Knowledge workers spend the bulk of their time simply discovering and preparing data, yet the majority of enterprise data remains inaccessible, which highlights that business data classification needs cannot be met by human means alone.

Automated document classification is key to driving the improved business insights and reduced compliance risks that can be achieved through the reduction of unstructured enterprise data.

Here are four key ways document classification using machine learning can help your organization keep up with growing data classification needs.

  1. Automated data classification can improve data accessibility

Within the volumes of content that enterprises create every day, there are heaps of valuable business information – and lots of data that, at best, is taking up space and, at worst, introduces errors and skews insights. At least a third of enterprise data is useless because it is redundant, obsolete or trivial (ROT)** – but because ROT tends to be embedded within all other organizational data, it would be virtually impossible to perform the level of detail-oriented cross-checking required to identify and eliminate it manually.

For computers, however, searching within countless data sets to find and filter ROT is an automatable task that can be performed to a high degree of accuracy. As part of the progressive data classification process, machines are able to identify and weed out ROT and improve the accessibility of high-quality data.
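One automatable slice of ROT detection is finding redundant content: exact duplicates can be flagged by hashing document content. This is only a sketch of the idea with made-up file paths; real ROT detection also covers obsolete and trivial content, which needs richer signals than a hash:

```python
import hashlib

def find_duplicates(docs: dict[str, str]) -> list[str]:
    """Return paths of documents whose content duplicates an earlier one."""
    seen: dict[str, str] = {}  # content hash -> first path seen
    duplicates = []
    for path, content in docs.items():
        digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
        if digest in seen:
            duplicates.append(path)
        else:
            seen[digest] = path
    return duplicates

# Illustrative repository: the second file is a redundant copy of the first.
docs = {
    "reports/q1.txt": "Quarterly results ...",
    "archive/q1-copy.txt": "Quarterly results ...",
    "reports/q2.txt": "New quarter, new numbers ...",
}
print(find_duplicates(docs))  # ['archive/q1-copy.txt']
```

Running a pass like this across millions of files is trivial for a machine and hopeless for a human team, which is precisely the point of the section above.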

  2. Automated data classification fuels productivity and ROI

As previously stated, knowledge workers can spend most of their time discovering and preparing data – and even then, most organizations still can’t make a dent in their volumes of unstructured information. And considering how fast new volumes of data are being generated, organizations would have to shoulder the salaries and related onboarding costs to hire more employees to manually keep up with the demand.

Adding to the financial toll of manual data classification, opportunity costs can arise from an inability to quickly and efficiently leverage new data in making business decisions. Businesses can also incur compliance penalties and other tolls from having unidentified (and thus unsecured) personally identifiable information (PII) in their fileshares. The accuracy of automated document classification over manual methods can significantly reduce these costs and risks, instead fueling productivity and increasing the ROI of data-based tasks.

  3. Faster, better identification of risky data

Whether due to data breaches or compliance regulations, unidentified PII is a ticking time bomb for any organization – and the faster enterprises can identify PII within their data stores, the faster they can take steps to secure this sensitive information and reduce any associated risks. From legacy paperwork to emails, unstructured data is often riddled with PII, making better data classification a business imperative.

The rapid pace of automated document classification means enterprises have a means for efficiently crawling data and identifying PII, reducing their risk even as new data is generated. By crawling all existing document repositories, and converting unstructured documents into searchable files, businesses can readily determine where PII is located within their data stores. The classification of such data further improves organizations’ ability to assess and address sources of PII, deleting documents of no value that contain sensitive information, redacting PII in instances where there is no business use for said content, and securing PII that must be retained.
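At its simplest, the PII-scanning step can be illustrated with pattern matching. Production solutions use trained models and far broader pattern sets; the two regexes below (email address and US-style social security number) are illustrative assumptions only:

```python
import re

# Simplified, illustrative PII patterns; real classifiers cover many more.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(text: str) -> list[str]:
    """Return the names of PII pattern types found in the text."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

print(scan_for_pii("Contact jane.doe@example.com, SSN 123-45-6789"))
# ['email', 'ssn']
```

Once a document is flagged this way, the redact/delete/secure decisions described above can be routed to the right owner instead of waiting on a manual review.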

Automated data classification presents enterprises with a consistent, scalable and ongoing method of addressing the growing business risk related to rogue PII.

  4. Accelerate business decisions

Whether it’s processing insurance claims or using live customer data to identify new business growth opportunities, document classification using machine learning can help to accelerate better business decisions. This is due to the difference in pure processing speed – while it can take humans hours and hours to comb through data for needed information, automated data classification increases an organization’s ability to rapidly and efficiently access accurate, high-quality data – meaning they can use that data that much faster.

Wrap Up

Enterprises today require rapid data classification capacity to enable access to critical business information. But if you’re relying on manual processes to classify your unstructured data stores, then you likely aren’t making a dent – and may be missing opportunities to glean valuable insights and reduce compliance risks in the meantime.

Schedule a live demo to see first-hand how Adlib can help you achieve intelligent information governance and content clarity with document classification using machine learning.

References:

*Rate of data generation

**ROT information

The post Four Advantages of AI-Based Data Classification appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.


By now, you’ve heard of Microsoft Teams, the cloud-based messaging and collaboration tool available as part of the Office 365 suite. As one of the newer kids on the block, there’s a lot to discover to make your working life more productive, efficient and collaborative.

Aside from all the obvious benefits with collaboration tools, like chat, video, meeting and file sharing capabilities, Microsoft Teams comes with functionalities that can make a real difference to your workday.

Here are our top tips to help improve your experience using Microsoft Teams:  

Customize your notifications

If you belong to a lot of active channels, like me, you don’t want to be interrupted with alerts every time someone makes a comment or mentions you. Customizing your notification settings is easy. Simply click on your profile picture in the top right corner and select the Notifications tab, where you can change the alert type and frequency.

Customize your notifications

Integrate

Every workplace is different, and every workplace uses a different combination of apps and tools that suit its needs. The Teams platform allows you to integrate those other apps and tools to help you collaborate more effectively and get work done faster. A bunch of programs are already available through the Teams Store, and Microsoft regularly releases new APIs to increase the integration capacity even further. Click on the Store icon in the bottom left corner to get started.

Click the store icon

Command more

The search bar at the top of your window, also known as the command box, helps you save time by quickly finding what you’re looking for. Some of my favourite shortcuts include:

  • /whatsnew – This command takes you to the Release notes tab. Given there are updates to Teams at a fairly frequent rate, this one is really useful to help you stay up to date with the latest features and capabilities.
  • /goto – If you’re a member of a lot of teams and channels this command is handy for finding the right one with ease.
  • /files – Can’t remember which channel that file is saved in? No problem – find, view and edit all your files from the search bar with this command.
  • /mentions – See everywhere you’ve been mentioned so you can easily view and prioritize what needs your attention.
  • /help – As the name suggests, if you’re having any issues this command gives you a shortcut to the Teams help center. 
Break down language barriers

As organizations become increasingly global, the need to communicate effectively and efficiently across countries, cultures and languages is becoming more important. With Teams you can translate messages in different languages with the click of a button. Just click on the ellipses on the message in question and select Translate. As the name suggests, it will translate the message into your default language. So, this…

Translate

Becomes this…

Background Blur

This is a funky function. We’ve all been in a situation where we have to take a video call in a less-than-ideal setting – maybe you’re working from home and don’t want that external stakeholder seeing pictures of your last holiday on the wall behind you, maybe you’re in a busy office and don’t want people distracted by what’s going on in the background, maybe you don’t want to risk any confidential information accidentally being shared. The Background Blur function literally blurs your background and makes you the focal point on your video call. To activate, simply click the ellipses from the control panel on your call and select Blur my Background:

Blur my background

And like magic:

Magic

And these are just the tip of the iceberg. Keep an eye on the Microsoft Teams blog (or command /whatsnew) for all the latest releases, tips and tricks to help you get the most out of your Teams usage.

Is your organization thinking of rolling out Microsoft Teams? Watch our webinar on The Journey to Microsoft Teams.


About the Author:

Kate Daley is the Content and Social Media Manager at IR. She developed a love of writing at an early age and has honed her content development skills in a variety of roles in both Australia and the UK.

IR provide performance management and analytics for enterprise communications, collaboration, and payment systems. To find out more about IR, visit www.ir.com.

Reference:

Daley, K. (2019). Top Tips for Using Microsoft Teams. Available at:
https://www.ir.com/blog/communications/top-tips-for-using-microsoft-teams [Accessed 8th July 2019]

The post Top Tips for Using Microsoft Teams appeared first on European SharePoint, Office 365 & Azure Conference, 2019, Prague, Czech Republic.

