Bloor has published its 2019 Market Update for Test Data Management, listing Redgate as an innovator, and scoring SQL Provision 4.5 stars out of 5 for test data provisioning.

If you’re not familiar with Bloor, it’s an independent research and analyst house founded to help organizations choose optimal technology solutions.

As database teams grapple with shortening release cycles and tightening data protection laws, the need to deliver realistic and compliant test data to development quickly and safely is greater than ever. Choosing the wrong approach can hamper development, drive down quality, risk non-compliance, and lead to escalating infrastructure costs.

In this post, I’ll summarize some of the key findings from the report and shed some light on that all-important score of 4.5 for SQL Provision.

In the guide, Bloor defines three primary methods of test data management: data subsetting, data virtualization, and synthetic data generation:

Data subsetting consists of taking a subset from one or more production databases, usually of a much smaller size than the database(s) as a whole.

Data virtualisation has a similar motivation to data subsetting, at its core: take large production databases and make them easy and efficient to distribute and test with. However, where data subsetting does this by reducing the amount of data being bandied around, data virtualisation does it by allowing you to create virtual copies of your databases.

Synthetic data generation breaks with data subsetting and data virtualisation by opting to disregard your production data for use as test data. Instead, it allows you to create your own ‘synthetic’ test data in an automated fashion.
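To make the first of these approaches concrete, here’s a minimal, hypothetical sketch of data subsetting in T-SQL: only the last 90 days of orders are copied into a development database, and the parent customers come along too so the subset stays referentially intact (the database, table, and column names are invented for illustration):

```sql
-- Copy the parent rows first (customers with recent orders), then the child
-- rows (the orders themselves), so foreign keys in the subset remain satisfied
INSERT INTO DevDB.dbo.Customers (CustomerId, FullName, Email)
SELECT c.CustomerId, c.FullName, c.Email
FROM   ProdCopy.dbo.Customers AS c
WHERE  EXISTS (SELECT 1
               FROM   ProdCopy.dbo.Orders AS o
               WHERE  o.CustomerId = c.CustomerId
               AND    o.OrderDate >= DATEADD(DAY, -90, GETDATE()));

INSERT INTO DevDB.dbo.Orders (OrderId, CustomerId, OrderDate, TotalDue)
SELECT o.OrderId, o.CustomerId, o.OrderDate, o.TotalDue
FROM   ProdCopy.dbo.Orders AS o
WHERE  o.OrderDate >= DATEADD(DAY, -90, GETDATE());
```

Even this toy example hints at the challenge: every table with a foreign key into the subset has to be handled in the right order, which is exactly where dedicated subsetting tools earn their keep.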

The analyst finds that the reduced size of the dataset achieved with data subsetting makes distribution easier and faster, but also that the method brings challenges around ensuring the subset is realistic and referentially intact. Data virtualization, he finds, delivers a dataset that is small and lightweight yet still fully representative.

When it comes to synthetic data generation, he describes its main advantage as being a failsafe way of ensuring no sensitive information is present (because the data is completely fake). But, as with data subsetting, arriving at a realistic dataset can be challenging.

For data subsetting and data virtualization, sensitive data will need to be obscured as part of the process, and the analyst mentions static data masking as a way to achieve this.
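As a simple illustration of static data masking (hand-rolled, with hypothetical column names; dedicated masking tools apply rules like this consistently across a whole schema), a non-production copy could be sanitized before it’s shared:

```sql
-- Overwrite identifying values with consistent, obviously fake equivalents
-- on the non-production copy, before it reaches development or test
UPDATE dbo.Customers
SET    FullName    = CONCAT('Customer ', CustomerId),
       Email       = CONCAT('customer', CustomerId, '@example.com'),
       PhoneNumber = '555-0100',
       DateOfBirth = DATEADD(DAY, ABS(CHECKSUM(NEWID())) % 10000, '19500101');
```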

In terms of the market trends driving test data management approaches, the analyst highlights the heightened importance of protecting sensitive data, thanks to the GDPR and upcoming regulations, and notes that this is driving both data masking and synthetic data generation capabilities.

An important trend for database development teams is the increased emphasis on test data provisioning, as opposed to merely test data management, driven by DevOps and Agile development practices:

The idea is to provide not only a way to create test data, but a method of distributing it effectively and efficiently to your testers, often by means of self-service. The advantage here is a significant improvement to the tester experience and to testing efficiency, thus (one hopes) preventing test data as a whole from becoming a bottleneck to your continuous testing, test automation, or DevOps pipelines … Test data provisioning particularly benefits from a data virtualisation capability.

Redgate is listed as an innovator in the report, and SQL Provision achieves a score of 4.5 stars out of 5 for test data provisioning. The accompanying InBrief explains the reasoning behind this:

Redgate’s entire approach hinges on two concerns: compliance (with existing mandates, such as GDPR, as well as forthcoming regulations) and DevOps. Effective test data management is essential for achieving both of these. Desensitising your test data is necessary for compliance with a variety of mandates, as well as ensuring data privacy and security (protection from data breaches, for instance), while timely provisioning of test data – delivering the right test data to the right place at the right time – is an important component of any DevOps pipeline. Consequently, SQL Provision provides both of these capabilities.

Moreover, SQL Provision does so using a combination of database cloning and data masking. This has some clear advantages over competing approaches, such as data subsetting or synthetic data generation, most of all that it guarantees that your test data will be representative. It also makes provisioning that test data fast and easy.

What’s more, Redgate is uniquely positioned in offering a solution based on database cloning to the mid-market, whereas competing products tend to be targeted at the high-end.

Organizations such as PASS and KEPRO are already taking advantage of the benefits a combined database cloning and data masking approach brings. By implementing SQL Provision, they’ve been able to solve the major challenges involved with provisioning realistic and compliant test data to development quickly and safely.

SQL Provision also opens new opportunities for database teams, such as enabling the transition from shared to dedicated development environments, shift-left testing, and refreshing data on-demand through self-service and automated processes.

It’s good to see that the advantages it brings to database development have now been recognized by an independent analyst.

To learn more, download your free copy of the Bloor 2019 Market Update for Test Data Management.

The post Bloor 2019 Market Update for Test Data Management lists Redgate as an innovator appeared first on Redgate Software.

There is constant pressure in software delivery to release at speed and often. To take an idea or fix and deliver it into the hands of customers in as little time as possible. However, releasing faster isn’t beneficial if what you’ve developed is of no value to the customer or business or, worse, contains errors.

The changes being made need to be tested both for business value and for functionality. Without appropriate testing in place you risk, for example, code smells creeping into production, resulting either in costly downtime or in errors that aren’t easily spotted at first but affect all code built on top of them, jeopardizing business performance. Teams then spend more time on rework than on adding value.

This is why you should consider implementing database unit testing as part of your company-wide standardized testing practices.

Introducing good habits – test-driven development

Testing code as a part of continuous integration in application development is fast becoming standard practice, as noted in The ROI of Compliant Database DevOps. Yet, while unit testing is regarded as a best practice, it is not something that is commonly carried out in database development. Unit testing isn’t just an extension of DevOps, however, but a practice that should be introduced everywhere as standard.

This behavior changes the way developers approach coding in general, improving the overall quality of what is being delivered before automation is even introduced. Errors and breaking changes can be caught early, preferably in sandboxed development environments, before they make it further up the release pipeline.

Test-driven development (TDD) is about teams fully exploring the problem they are trying to solve before they write the code itself. How are they approaching the problem, and structuring the solution? What are they trying to achieve and, importantly, what does success look like? Changes are then validated in development, in terms of both business value and functionality, and are ready for more rigorous testing in the appropriate test and QA environments. As a result, releases to production are more likely to be valuable and less likely to cause immediate or underlying issues.

When we talk about introducing a company-wide standardized testing practice, this doesn’t mean a generic set of tests to be run for every change. Rather, it’s a standardized method of testing and providing the appropriate tools to development teams to write their own tests.
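To make that tangible: Redgate’s SQL Test is built on the open-source tSQLt framework, so a database unit test written ahead of the code it exercises might look like the sketch below. The dbo.OrderLines table, the dbo.CalculateOrderTotal function, and the expected behaviour are all hypothetical; the point is that the test states the expected result up front and fails until the function delivers it.

```sql
-- Group related tests into a tSQLt test class (implemented as a schema)
EXEC tSQLt.NewTestClass 'OrderTests';
GO

CREATE PROCEDURE OrderTests.[test CalculateOrderTotal sums its order lines]
AS
BEGIN
    -- Isolate the test from real data: FakeTable swaps in an empty copy
    EXEC tSQLt.FakeTable 'dbo.OrderLines';
    INSERT INTO dbo.OrderLines (OrderId, Amount) VALUES (1, 10.00), (1, 15.50);

    DECLARE @expected DECIMAL(10, 2) = 25.50;
    DECLARE @actual   DECIMAL(10, 2) = dbo.CalculateOrderTotal(1);

    -- Red/green: the assertion fails until the function sums the lines correctly
    EXEC tSQLt.AssertEquals @Expected = @expected, @Actual = @actual;
END;
GO

-- Run every test in the class and report the results
EXEC tSQLt.Run 'OrderTests';
```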

Feedback loops

Having dedicated testing or QA environments within the development pipeline is a common approach to testing database code before it reaches production. Development happens in dev. Testing happens in test. By removing the segmentation, however, vital feedback can be made immediately available, proving the value of any prospective change or allowing teams to make the necessary amendments there and then.

With no delay, developers are working more effectively and not putting new tasks on hold while waiting on validation of work they deem ‘completed’.

Unit testing is not a replacement for test or QA, but rather another layer of testing to work alongside the existing practices for more rigorous or specialized testing that occurs further up the pipeline as normal.

Shifting left – the foundation for automation

According to the 2019 State of Database DevOps Report, just 19% of participants practice test automation in their database development. This could be because moving to full automation can be too big a jump if the proper foundations are not laid first.

Creating a standardized approach to testing will allow for a smaller and smoother move to automation when ready. Without appropriate standards in place, errors slip through into production faster and more frequently. More time is spent bug hunting, unpicking code, or troubleshooting, and the benefits of automation are lost in reactive rework.

It’s important that, by the time you reach the stage of automating tests, you aren’t combining multiple disparate methodologies to tackle the same problem. Take a step back to review and put processes in place upfront.

Unit testing is an important part of a standardized, reliable process because it helps developers ‘shift left’. It introduces known, quantifiable, and repeatable practices to catch errors and improve the quality of the code being delivered. When automation is introduced, the outputs are readable, so catching and fixing errors is much easier.

Changing behaviors is challenging, but if teams are supplied with the correct tooling, it is easier to introduce good habits and make that shift. For example, SQL Test equips developers to write their own tests without needing to learn new technologies, working in a language that is familiar and comfortable to them.

Summary

Database unit testing can be implemented alongside other standardized development practices such as version control, defined company-wide coding styles, and dedicated development environments. In conjunction with these practices, development teams are turning out higher quality code, reducing the risk of errors and subsequent downtime, and, importantly, collaborating to produce innovative valuable releases for the customer.

If you’d like to know more, this blog post explains the four steps to laying the foundations for standardized database development, and outlines the best practices for doing so.

Ready to discuss how to standardize your development processes? Take a look at our solution pages or get in touch to discuss your business challenges and requirements.

The post Database unit testing: Setting your team up for valued software delivery appeared first on Redgate Software.

Development and IT departments face increasing demands to deliver changes faster, often with increasing complexity and less tolerance for downtime, and without a corresponding growth in team size to handle the extra work.

Standardizing database development practices by establishing coding standards can help achieve this by removing blockers to understanding code, easing the implementation of new code or processes, and improving code quality, so less time is spent on maintenance or fixing mistakes in the future.

Improving the readability of code

Setting standards for how code is written, formatted, structured, and styled means anyone, even those not immediately familiar with the database code, will be able to understand it more easily. Developers will pick up and write code faster, with fewer blockers from unclear code or processes, making teams more productive, resilient to absences, and able to accommodate flexible or remote working arrangements.

You’ll also be able to scale teams more quickly, as hiring or changing team configurations becomes easier when code standards simplify onboarding. That lets you adapt to emerging risks and opportunities, whether it’s fixing a critical bug before launch or exploring a new feature.

Raising the quality of code

Bugs in code are an ever-present risk, but introducing code standards reduces that risk and mitigates the harm when bugs inevitably arise.

Setting standards such as rules on code quality and best practice will minimize the time needed to correct errors while also making new code integration and long-term maintenance less taxing. Even so, mistakes will still occur and regular, automatic checks of code are important to identify them before they reach users. For example, peer code reviews will be more effective with standardized code styles, and setting up static code analysis against agreed code quality rules will help enforce code standards and find errors.
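For example, a static analysis pass might flag the first query below against a team’s agreed rules, with the second query showing the standardized rewrite (the table and the specific rules are illustrative; real rule sets vary by team and tool):

```sql
-- Likely to be flagged: SELECT *, a missing schema prefix, and a predicate
-- wrapped in a function, which prevents the optimizer from using an index
SELECT * FROM Orders WHERE YEAR(OrderDate) = 2019;

-- The same query rewritten to meet the agreed standard
SELECT o.OrderId, o.CustomerId, o.OrderDate, o.TotalDue
FROM   dbo.Orders AS o
WHERE  o.OrderDate >= '20190101'
  AND  o.OrderDate <  '20200101';
```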

Simplifying the introduction of new technologies and processes

By keeping your code within set standards, introducing other processes and technologies becomes faster and easier throughout your organization. Whether you’re reworking your delivery pipeline or instituting quality control with code analysis, everything will go more smoothly when people know what to expect in the code.

Additionally, once a new tool or process is found useful by one team, it will be easier to share the benefits throughout your organization without losing time on duplicating work.

What to standardize

Code standardization covers a broad range of topics and the exact areas you’ll want to standardize and to what degree will vary between organizations, depending on the priorities, technologies, and practices in place. To help get you started on where you should be standardizing, common targets include:

  • Code formatting and styles
  • Universally agreed naming conventions
  • How information on what the code does and why changes are made is documented
  • Code quality rules and best practice
  • Code testing and review procedures

Third-party tools like SQL Prompt, which takes care of code formatting and uses static code analysis to identify errors in code as it is typed, can play a part, but you’ll also need input from across your organization.

This can be daunting, but many of these coding best practices are already in place for application development, and 77% of application developers are also responsible for database development, as reported in the 2019 State of Database DevOps Report. Now is the time to bring these practices across to your database development as well, taking advantage of knowledge that already exists in your organization but isn’t yet being fully utilized.

Summary

Standardizing your database development by introducing coding standards will ensure that your teams keep up with increasing demand and deliver higher quality results, whether by removing blockers to understanding your codebase, easing the implementation of new features or processes, or reducing the burden of maintenance and quality assurance. IT and development teams will thus be able to deliver value more frequently, while minimizing costly mistakes and downtime.

By combining code standards with the other aspects of standard database development practices, you can compound these benefits and lay the foundation for further improvements across your whole database development process.

Standardizing code is just one of our four steps to standardized database development that lay the foundation for DevOps. If you want to know more, take a look at our solution pages or get in touch to discuss your business challenges and requirements.

The post Enhancing your database development using coding standards appeared first on Redgate Software.

Diversity is one of the most prominent issues faced by digital businesses today. Research from industry body Tech Nation shows that women in IT are outnumbered by men 4:1 and make up just 19% of the UK digital workforce.

The lack of young women taking up IT courses is a big cause of this imbalance. According to a report from the Royal Society, only 20% of GCSE computing students and 10% of A-Level computing students were girls.

Clearly, there is a grass roots problem and more needs to be done to show young people, especially young women, that the technology industry is full of great opportunities.

Defeating stereotypes

Kate Martin, Business Operations Associate at DevOpsGroup, believes that poor diversity in tech stems from the education system. “When I was in school, I can’t remember being told about the opportunities available in the IT industry. Technology just isn’t promoted as a subject like English or history, but nowadays it should be,” she says.

“Although I work in a technology role today, I didn’t study computer science in school, college, or university. It just didn’t feel like a natural option at the time. Eventually, I went on to do an engineering degree. I’d say I stumbled into the industry by accident, but my skills were transferable.”

Throughout her career, Kate has noticed a clear gender imbalance and says tech is still viewed as a male-dominated world. “When taking part in a graduate scheme within the technology arm of a large organization, there were five of us but I was the only female. I did feel like the token girl at times – even being told to dress in a certain way,” she says.

“To me, the issue is that women are deterred from pursuing technology careers due to stereotypes. However, there are so many women already forging successful careers in tech, and there’s a wide range of opportunities available. As an industry, we just need to get better at highlighting them and show girls that tech is an exciting area.”

Starting a conversation

Lucy Young, People Success Associate at DevOpsGroup, has been working in technology for around four years. Like Kate, she didn’t originally plan on developing a career in tech. “I initially started out doing general admin work at a solicitor’s firm, but I never really enjoyed that. It just wasn’t really a passion of mine,” she says.

“However, I’ve always had an interest in technology and ended up finding an opportunity in the sector. In my last workplace, there were around fifty people in the IT department but only four women.”

Her view is that organizations need to talk about diversity more and work together to find solutions. “During my time in technology, I’ve noticed more men working in the sector. And even now, you can still see that it’s a common thing. We need to be aware of this and try to bridge the gap.”

“A big part of my current role is improving diversity in the workplace. Talking about the challenges in the industry is crucial. At DevOpsGroup we’re in the process of setting up a diversity working group to raise awareness and eradicate the gender gap. We also work closely with organizations such as Chwarae Teg, the charity pioneering gender equality, to explore different ways we can improve our diversity. This year, we were awarded with their prestigious Exemplar Employer Award for our efforts.”

Promoting tech

Katherine Axten, an engineering intern at DevOpsGroup, developed a fiery passion for technology at a young age. “My dad has always worked in tech, and he encouraged me to do more techie things. When I was younger, I used to take PCs apart and had a lot of fun in the process,” she says.

“After leaving school, I didn’t go for a career in tech straight away. But a couple of years ago, I took a programming course online and really enjoyed it. Then I decided to take it up at university. I guess I got into it by trying different things, but my dad’s encouragement was the biggest motivating factor.”

While Katherine has been immersed in technology from a young age, she believes that not enough is being done to position technology as an attractive career opportunity for young women.

“When I was in school, there were only three girls taking IT at GCSE. And even now, I don’t feel like there’s much encouragement. Speaking to old colleagues about my plans to study programming at university, they were all really surprised and were like, ‘Oh, but you’re a girl’,” admits Katherine.

She suggests: “It would make a such a difference if female role models went into schools to talk about the opportunities in the industry. There are so many great women working in technology, and they’re inspiring. For a long time, technology roles have been viewed as something men would traditionally go into. That’s changing as we highlight the women in tech, but we need more encouragement that it’s okay to pursue a career in something which is not considered to be a traditional path.”

The gender gap in technology isn’t something that can be eradicated overnight. But this can be achieved over time by the industry and education system working together to highlight the opportunities in tech and break down age-old stereotypes.

To learn more about the great women working at DevOpsGroup, check out our team page.

With offices in Cardiff and London, DevOpsGroup delivers IT transformation at the speed of disruption by building DevOps capabilities within its clients, enabling businesses to continually meet the relentlessly increasing demands of delivering great digital customer experiences. To find out more, visit DevOpsGroup.com.

The post Diversity in tech – how can we close the growing gender gap? appeared first on Redgate Software.

For most application developers, it’s unthinkable to work without version control. The benefits of tracking and retaining an incremental history of code changes are long understood in the world of software development. No surprise then that the overwhelming majority of respondents in our 2019 State of Database DevOps survey confirmed they’re already using this practice for their application code.

But it was a different picture when we asked about database version control. Only 55% of those same people stated that they used version control for their database changes. In a way it’s understandable, as database version control was, for a long time, seen as unfeasible. But now that’s no longer the case, it’s time the database was treated in the same way as the application.

Standardizing processes across teams, projects, and both database and application code unlocks significant quality improvements and time savings, as this blog post explains. It outlines the benefits of standardized team-based development and talks about the four steps you can take to introduce it.

It’s important because databases are increasingly in the spotlight due to the recent proliferation of data breaches, a multitude of new regulations, and the increased pace of database development. In this climate, the need for an incremental history of changes and more efficient processes is more compelling than ever, so if you’re not already versioning your database code, here are some of the reasons why you really should be, and some of the benefits you’ll discover:

1. Ease collaboration across distributed teams

Putting database code into a version control system makes it much easier to share code changes and coordinate the work of the various team members who are responsible for the database. The ability to rapidly share and manage changes makes it particularly important for teams based in different locations, and evidence shows that teams are increasingly distributed. The Stack Overflow 2019 Developer Survey, for example, found that the majority of developers work remotely more than once a month, and 12% are full-time remote workers.

With SQL Source Control, team members can choose to work on a shared database, safe in the knowledge that they won’t be overwriting each other’s work. With features like object locking, conflicts are easily avoided and developers can check code in and out with confidence.

Once the database code is in version control, the path is also paved for adopting a dedicated development environment approach. Giving each developer their own dedicated, up-to-date copy of the latest version of the database brings in the freedom to try out new things, risk-free.

2. Gain better visibility of the development pipeline

A version control system provides an overview of what development work is going on, its progress, who’s doing it, and why. It also maintains detailed change histories and can be associated with issue tracking systems. For example, SQL Source Control lets you associate database tasks with Microsoft’s Azure DevOps Server work items so that you have a complete view of your workflow.

3. Have the ability to roll back or retrieve previous versions of the database

While you should always have a reliable backup strategy in place, getting a database into version control also provides an efficient mechanism for backing up the SQL code for your database. Because the history it provides is incremental, version control lets developers explore different solutions and roll back safely in the case of errors, maintaining the referential integrity of your database while giving you a risk-free sandbox. SQL Source Control makes it easier by allowing users to simply roll back and resolve conflicts straight from the Object Explorer.

4. Demonstrate compliance and support auditing more readily

The change tracking provided by version control is the first step to getting your database ready for compliance, and an essential step in maintaining a robust audit trail and managing risk. Compliance auditors will require an organization to account for all changes to a database, and detail all those with access to it.

New data protection regulations are emerging all the time, and requests to demonstrate compliance or view the history of a database are increasingly frequent. With SQL Source Control, you can look through the full revision history of a database or database object and see exactly who made the changes, when they made them, and why.

5. Lay solid foundations for automating database deployments

The State of Database DevOps Survey also revealed that the task of database development has moved into the hands of the majority of application developers. This is encouraging because version controlling database code opens the door for automated deployments, which is common practice in application development.

This in turn removes the bottleneck created by not synchronizing application and database deployments. Complex processes become easier to automate and more repeatable, and deployments much more predictable because you’re working with a stable version of the database, which is being developed alongside the application. Using code checked into SQL Source Control as the basis for the automated builds and tests run by SQL Change Automation, for example, means that problems are found earlier, and higher quality code is shipped and deployed.

6. Synchronize database and application code changes

The 2019 State of Database DevOps Report found that the top two challenges in integrating database changes into a DevOps process are synchronizing application and database changes and overcoming different approaches to application and database development.

Having the database in version control alongside the application immediately addresses both concerns, because you always know the version of the database being deployed directly corresponds to the version of the application being deployed. This close integration helps to ensure better coordination between teams, increases efficiencies, and helps when troubleshooting issues. To make it easier, SQL Source Control plugs into version control systems used for storing application code changes like Git, Azure DevOps Server and Subversion.

Summary

While it’s true that database version control wasn’t always achievable, the availability of tools like SQL Source Control means it’s now easier than ever to version database code alongside application code across your organization.

Take your next step towards standardizing and automating your software delivery and become a truly high performing IT organization. If you’re among the 45% not yet version controlling your database, hopefully at least one of the six reasons above will have resonated enough for you to explore it further.

Find out more about putting database version control in place with SQL Source Control. SQL Source Control is part of the SQL Toolbelt, the industry-standard suite of tools for SQL Server development and deployment.

The post Six reasons to version control your database appeared first on Redgate Software.

I recently did some research on the source of data breaches and in this article, I’m going to talk a bit about my current favorite source for breach information, and what I learned.

Verizon publishes the Data Breach Investigations Report annually and the latest report is the 11th edition, so they’ve had some practice. The free reports are extremely well detailed and, shockingly, they’re even entertaining to read.

The reports don’t claim to discover all data breaches. After all, not all data breaches are discovered, and those that are discovered aren’t necessarily reported.

The 2018 report covers 53,000 incidents, defined as: A security event that compromises the integrity, confidentiality or availability of an information asset.

It also covers 2,216 breaches, which are defined as: An incident that results in the confirmed disclosure — not just potential exposure — of data to an unauthorized party.

These numbers (and the screenshots I’m sharing below) do NOT include breaches involving botnets. Instead, the 43,000 successful accesses via stolen credentials associated with botnets are handled in a special insights section of the report.

Are data breaches caused mainly by insiders or outsiders?

A colleague of mine mentioned that he’d recently seen some numbers suggesting that data breaches were mainly perpetrated by insiders to an organization — but he hadn’t been able to track down the source of those figures or substantiating data. With the number of data breaches we see these days, that’s a pretty dark view of employee-employer relationships!

Here’s what the Verizon report shows in terms of who is behind the breaches:

2018 Data Breach Investigations Report, 11th Edition, Verizon, page 5

These figures relate to confirmed data breaches, not all security incidents. While 28% involve internal actors, the bulk of data breaches come from people outside the organization, finding their way in by using malware or social attacks, or by exploiting vulnerabilities created by errors.

Who can a database administrator trust?

For those internal actors involved in data breaches, my first thought was, Well, so WHO WAS IT?

That’s answered a couple of pages later. While the exact internal actors weren’t identified for all of the reported data breaches, an analysis was done for 277 data breaches:

2018 Data Breach Investigations Report, 11th Edition, Verizon, page 9

As much as database administrators like to focus on denying permissions to developers for production, developers were much less likely to be involved in data breaches than system admins. And who exactly are system admins? Well, I’m guessing that includes … the DBAs.

Awkward.

This is remarkable given that you don’t need production access to cause a data breach. It’s pretty normal practice in an enterprise to make copies of production data for use by analysts, developers, product managers, marketing professionals, and others.

Redgate’s 2018 State of Database DevOps Report, for example, found that 67% of respondents use production data in development, test, or QA environments, and that 58% reported that production data should be masked when used in these environments:

The 2018 State of Database DevOps, Redgate, page 12

There are good reasons that production data is spread around like this: performance, for example, is extremely difficult to predict when testing with data that doesn’t closely match the distribution and size of the production data.

But after many years of working in IT, I know that most often this data is not modified or masked after being duplicated. These environments tend to be far less secure than production environments, and they are a very rich target for data breaches — even if it’s not the developers themselves intentionally causing the data breach.

That’s worrying, given that the rise of malware and social attacks means that all environments in an enterprise can be the source of a data breach. And perhaps a sign that more attention should be given to introducing measures to prevent such breaches.
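A small first step is simply knowing where that copied data lives. As a rough, illustrative sketch (the name patterns are far from exhaustive), a quick pass over a restored copy’s metadata can at least show which columns probably need masking before the copy is shared:

```sql
-- List columns whose names suggest they hold personal data
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  COLUMN_NAME LIKE '%name%'
   OR  COLUMN_NAME LIKE '%email%'
   OR  COLUMN_NAME LIKE '%phone%'
   OR  COLUMN_NAME LIKE '%birth%'
   OR  COLUMN_NAME LIKE '%address%'
ORDER BY TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME;
```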

Kendra Little is a Microsoft Certified Master, a Microsoft Data Platform MVP, and a Redgate DevOps Evangelist. You can find her online at littlekendra.com.

The post Where do data breaches come from? appeared first on Redgate Software.

One of the most profound challenges faced by technology companies today is talent. According to the latest Harvey Nash survey, 65% of CIOs believe that a shortage of tech skills is affecting their ability to respond to change.

The study quizzed 3,958 leaders globally and explored the causes of the tech skills gap – particularly Britain’s withdrawal from the EU and US immigration changes. It found that 42% of European CIOs are unsure how Brexit will affect their hiring plans, and in the US, nearly half of IT leaders think stricter rules for the H-1B visa will derail their plans to attract foreign talent.

When it comes to the skills in demand, popular roles include big data and analytics specialists (46%), technical architects (35%), enterprise architects (35%), and security experts (35%).

A growing challenge

Steve Thair, Co-Founder and CPO of DevOpsGroup, says the issue goes back to when the technology industry moved many jobs abroad and shifted to outsourcing as a means to deliver IT services.

“Because of this, there’s been a lack of entry-level jobs over the past few years, and people are struggling to launch their careers in technology. Businesses just haven’t been focusing on supporting the next generation of talent,” he says.

A lack of professionals from different backgrounds is another contributing factor to the growing skills gap. Thair believes that companies are effectively ignoring half of the workforce.

He explains: “According to Tech Nation, diversity is a key challenge for digital tech businesses, and women working in IT are outnumbered 4:1. As an industry, we must do more to break age-old industry stereotypes, promote technology as a viable career option, and highlight the different opportunities available in the sector.

“There is a huge range of jobs now available other than just developers – product management, digital design, test automation, delivery management, data science, and AI just to name a few. We need to promote the diversity of roles to attract the diversity of people.”

Modernizing education

To eradicate the tech skills gap and ensure organizations can hire the right people, Thair says the learning paradigm must change.

“When you look at the rise of sites like Pluralsight, Udemy, Khan Academy, and the Lynda platform, there’s a huge amount of online learning resources focused on technology, and it’s there because what’s being taught in the education system isn’t relevant to modern software development,” he says.

“With the rate at which technology methods and frameworks are evolving, the current education model simply doesn’t work. Everyone should be in a continuous learning cycle and be constantly evolving their skillsets to remain relevant.”

Ryan Cullen, People Success Lead, says there are many innovative companies pushing the boundaries of technology and making it a growth area for the UK economy. But echoing similar thoughts to Thair, he says the education system just hasn’t caught up.

“Organizations are struggling to hire people who have the right skills and experience to push their organizations forward in today’s interconnected world. The industry should have identified these challenges earlier and demonstrated the value to those starting their educational journeys that a career in technology is something exciting and worth exploring. However, we’re now in a situation where there’s a clear talent and skills shortage.”

Brexit will only add to this skills crisis, especially when it comes to talent attraction.

Cullen adds: “A lot of talented people are worrying about their status after Brexit and may move to another European country, creating a brain drain in the UK. At the same time, Brexit is likely to deter skilled professionals from coming here.”

He adds: “At DevOpsGroup, we’re doing a number of exciting things to eradicate the tech skills gap. For example, through our Academy, we’re working with a range of universities to train the next generation of IT talent. We’ve also teamed up with the charity pioneering gender equality, Chwarae Teg, to improve gender diversity in the industry, and our workforce modernization plan should help re-equip those with skills that may have become dated.”

It’s clear that the technology skills gap is a real issue for organizations globally, and this isn’t something that can be solved overnight. However, bridging the gap between education and business is a crucial starting point. This will ensure the IT pros of tomorrow are equipped with the practical skills and experience to hit the ground running.

Learn more about how we’re trying to close the tech skills gap via the DevOpsGroup Academy.

With offices in Cardiff and London, DevOpsGroup delivers IT transformation at the speed of disruption by building DevOps capabilities within its clients, enabling businesses to continually meet the relentlessly increasing demands of delivering great digital customer experiences. To find out more, visit DevOpsGroup.com.

The post How can we close the tech skills gap? appeared first on Redgate Software.

Following the inaugural SQL in the City Summit in London, we’re excited to take the event on tour over the coming weeks, bringing major industry figures together to present in Los Angeles, Austin, Australia and New Zealand.

Redgate’s Summits aim to demonstrate how data professionals can deliver value faster, while keeping data safe, by adopting DevOps for the database.

In an age of increasingly stringent data protection legislation across the globe, this compliant database DevOps approach is gaining a lot of attention and the Summits will feature well-known figures from the DevOps world alongside Microsoft Data Platform MVPs, consultants and IT experts.

At the Los Angeles event on May 15, Donovan Brown, Principal DevOps Manager at Microsoft, will give the opening keynote about the key findings from the 2019 State of Database DevOps Report. Known in DevOps circles as ‘the man in the black shirt’, Donovan wrote the foreword to the report and is passionate about including the database in DevOps.

He’ll also be sharing his experiences of what you need to know before launching a DevOps Initiative as part of the panel Q&A alongside Microsoft MVPs Brian Randell and Ike Ellis.

In Austin on May 23, Jeffrey Palermo, CEO of Clear Measure, will talk about what to consider when scoping a DevOps project, how to instigate cultural change to support it, and creating a game plan for a successful proof of concept.

And over in Australia and New Zealand the founder of SQL Down Under, Greg Low, will provide a detailed introduction to Azure DevOps, and show how compliant database DevOps fits into the picture.

Other speakers involved in all three Summits Down Under include Microsoft MVPs Steve Jones, Hamish Watson and Warwick Rudd, as well as Microsoft Data Platform Consultant, Kelly Broekstra.

Redgate’s Microsoft Data Platform MVPs, Kendra Little and Steve Jones, will also be joined by Pre-Sales Engineers and data specialists, all aiming to share their expertise in compliant database DevOps. Each of the one-day Summits will give Senior Data Platform Professionals the knowledge they need to implement a consistent, scalable, and repeatable process to help their teams keep application and database development in sync, while protecting data at every stage.

Ike Ellis, Microsoft MVP and partner at San Diego software studio, Crafting Bytes, will be speaking at both the Los Angeles event and the Seattle Summit Pre-Con before PASS Summit this November, and comments: “This is a rare opportunity for IT professionals to get a real handle on DevOps and data privacy issues. Every business everywhere is now facing the same challenge to release software faster yet protect customer data, and these events give practical advice from businesses who are already doing it.”

The SQL in the City Summits will be held in Los Angeles, CA on May 15, Austin, TX on May 22, Brisbane, AUS on May 31, Christchurch, NZ on June 7, Melbourne, AUS on June 14 and Seattle, WA on November 4. To find out more information and register for any of the events, visit the SQL in the City Summit event page.

The post Learn from expert industry speakers at SQL in the City Summits in the US and APAC appeared first on Redgate Software.

Ever since the GDPR was introduced, the subject of data breach notifications has worried a lot of people. How do you write one? What do you need to include? What will the ramifications be? Will it make your customers run for the hills? Will it get you fired?

I’ve got news for you, courtesy of Computing.co.uk, one of the technology websites I subscribe to. Data breach notifications can be polite, informative, and leave a favorable impression. This is an email I received from them the other day:

The email gives the background, the details, the steps they’ve taken to mitigate the risk, the option to reset my password – everything I would want to know in just 286 words. It doesn’t pull any punches, but neither does it come across as an apology on bended knees.

The subject line of the email, incidentally, was ‘Notification of a potential data security breach of your password’. Honest, open, and enough to make me read the email – and then be reassured by its content.

So if you’ve been wondering what to do if you have to write a data breach notification, a good first step is to follow the example of Computing.co.uk. They’ve done a lot of the hard work for you.

And, yes, I remain a subscriber.

If you’d like to know more about data breach notifications, there’s a fascinating article by William Brewer on Redgate’s technical journal, Simple Talk.

The post Data breach notifications don’t need to be scary appeared first on Redgate Software.

In the increasingly demanding world of software delivery, IT teams are feeling the pressure to deliver value to customers – either internally to the business, or external users – quicker than ever. This is often at odds with the day-to-day demands associated with maintaining legacy code, fixing and reworking issues, and trying to collaborate effectively with team members who may or may not be located in the same office, building, or even country.

Then there’s the database, which is still perceived as a bottleneck in the development process for three reasons.

1. The shape of the IT team is changing

In order to meet business demands, teams are increasingly distributed across multiple locations, time zones or working patterns. The Future Workforce Report 2018 states: 55% of hiring managers say remote working is more common than 3 years ago, and 53% use flexible workers. This matches similar comments we frequently hear from our own customers.

Without standardized development systems and processes in place, this presents a number of challenges. Among teams, effective collaboration is nigh on impossible, and time and space to innovate is difficult to orchestrate around everyday pressures.

From a team manager’s perspective, new starters are unable to come in and pick up where the last person left off. Projects are also slow off the line and time is typically spent debating the fundamentals like version control setup and consistent naming conventions, which could already be in place to help teams hit the ground running and deliver value sooner. Not only that, a general lack of visibility of team progress makes everyone’s life harder.

2. You’re expected to deliver more with less

To add to this frustration, IT teams are under pressure to deliver more with fewer resources. They’re running just to stand still, spending time repeating work unnecessarily and having to make risky hot fixes. As a result, skilled staff are wasting time on tedious, manual rework rather than more enjoyable, value-added development. Businesses are losing out too, with new features that could make them more competitive taking longer to introduce.

3. Database development is lagging behind

For DBAs, having a centralized role supporting multiple teams means they often feel the pressure as they try to deal with a range of changes and issues without consistent structures or formats.

Then there’s the proverbial elephant in the room affecting businesses the world over: concerns about data privacy are blocking developers from accessing realistic data in development and test environments. This leads to performance issues further down the line, thus perpetuating the need for avoidable rework. Without version control and a solid audit trail, team leads also have a lack of visibility over who made what changes and when.

With too many manual steps leading to error-prone development, and conflicting changes that need troubleshooting and reworking, teams are unable to respond to business needs fast enough. Ultimately, without a unified process, the database slows down software delivery and deployments are often risky, causing tension between application and database teams.

There is another way

If this is sounding highly relatable, where do you start? We often hear from our customers: “We’re trying to achieve a digital transformation”, and “We want database automation in line with our application”. For sure, the holy grail may be automated deployments and full compliant database DevOps to enable the deployment of application and database changes in a frequent, reliable and repeatable way. However, getting to that goal requires taking smaller steps first.

If DevOps is a completely new concept for your organization, you may find pushing for full automation right away creates a culture shock that can fail at the first hurdle. In the 2019 State of Database DevOps Report, 60% of respondents said that the speed of delivery of database changes, freeing up developers’ time for value-added work, and reducing the risk of losing data during deployments were the top drivers for automating the delivery of database changes. By laying the foundations with first-class, standardized development practices, you can reap these benefits sooner than you think.

Let’s take a look at four key best practices to help you work more efficiently.

Adopt version control

The 2019 State of Database DevOps Report revealed that 77% of application developers are also now responsible for database development. This makes standardizing the way database code is written and version-controlled important in order to reduce errors, with everyone in the team working in the same way from a single source of truth. It also provides an audit trail of the changes made, enabling compliance to be demonstrated, and provides the foundation for automated Continuous Integration and Continuous Delivery processes in the future.

Introduce dedicated development environments

With version control in place, extend the process by equipping development teams with realistic, sanitized copies of production in their own dedicated environment. This will avoid unpredictable performance problems further down the line because developers are working on realistic datasets, while minimizing the risk of overwriting each other’s changes. With the processes in place to provide developers with the up-to-date, compliant environments they need, they can focus on value-added development.

In this article, James Murtagh explains how to ease the transition from shared to dedicated database development.

Standardize coding styles

Next, supply developers with tools to considerably speed up writing code. Ensuring a consistent and uniform coding standard across and within teams allows for potential issues and errors to be caught upfront before they’re promoted up the pipeline, particularly if the tools include static code analysis.

This empowers teams to deliver quality code and frees up their time for value-added work. Consistent, readable code also makes it far easier to collaborate, review each other’s work, onboard new team members to a project, and quickly spot where issues lie.

Adopt unit testing for the database

Finally, unit testing is important as part of a standardized, reliable process because it helps developers ‘shift left’. It introduces a known, quantifiable, and repeatable practice for catching errors in code, and ensures that all developers can see how their code will be evaluated and, more importantly, understand, write, and contribute to company-wide testing best practice moving forward. Having an automated, readable output at the testing stage makes catching and fixing issues much easier.
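If the tests are written with a framework like tSQLt (which underpins Redgate’s SQL Test), that readable output is simply a table the pipeline, or a developer, can query, as in this brief sketch:

```sql
-- Run every tSQLt test in the database as part of the pipeline
EXEC tSQLt.RunAll;

-- Results land in a table that a build step (or a developer) can inspect
SELECT Class, TestCase, Result, Msg
FROM   tSQLt.TestResult
ORDER BY Class, TestCase;
```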

It’s also important that once you reach the stage of automating tests you’re not trying to combine multiple disparate methodologies and approaches to what is ultimately the same problem, so taking a step back to review and put processes in place upfront will pay dividends.

Summary

By adopting the industry-standard tools for version control, provisioning and coding, teams will find it easier to collaborate, free up their time to innovate, add value and focus on more enjoyable work. For the business, the foundations for automation and compliant database DevOps are being laid, while speeding up and simplifying team-based database development.

Ready to discuss how to standardize your development processes? Take a look at our solution pages or get in touch to discuss your business challenges and requirements.

The post 4 steps to laying the foundations for standardized database development appeared first on Redgate Software.
