My third and final video for my VidCon 2017 coverage.
Below is my update from Day 2 of VidCon 2017. As you’ll see, while I have experience recording screen presentations, I’m still learning the ins and outs of recording myself on video, especially in the audio department.
Bear with me; I’ve already ordered some new equipment to help on the audio front, and as I continue to learn, things will only get better. While I definitely plan to make screen capture style recordings (such as my PowerShell and XML video) the bulk of my content, I will be doing live presentations, such as this vlog, from time to time.
In the recording I mention several products I saw, the links for which are at the bottom of this post. I want to be clear that these are not paid promotions, nor did I receive any type of compensation. I was just impressed with them and wanted to share.
To find out more about me: http://arcanecode.me
Social Blade: socialblade.com
Mighty Selfie Stick: bit.ly/10ftSelfie
Katie’s YouTube Channel: https://youtube.com/ktmh9600
My thoughts on the first day at VidCon 2017, the video creators conference. In keeping with the event I did this as a video blog. More to come!
In this, my first video from my YouTube channel, I give a brief overview of reading, changing, and updating XML files from within PowerShell.
If you enjoy it, please be sure to visit my YouTube page and subscribe for more content.
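For a taste of what the video covers, here’s a minimal sketch of reading, changing, and saving an XML file in PowerShell. The file path and element names are made up for illustration.

# Load the XML file into an XmlDocument (path and element names are examples)
[xml]$doc = Get-Content -Path 'C:\Demo\Books.xml'

# Read: list the title of each book in the catalog
$doc.catalog.book | ForEach-Object { $_.title }

# Change: update the price of the first book
$doc.catalog.book[0].price = '9.99'

# Save the modified document back to disk
$doc.Save('C:\Demo\Books.xml')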
I realized it had been far too long since I updated the look and feel of the blog. If you’re reading this, you can see the new design for yourself.
Over to the right you can now see my Twitter feed. I post multiple stories daily of interest to the tech community.
I’ve also created a new “About” page.
On it you’ll find my bio, plus links to all my social media accounts, including my github page where all my code samples now reside.
You’ll also find the link to the Facebook site for my company, Arcane Training and Consulting, where I post stories and discussions on a variety of technical subjects including SQL Server Business Intelligence, PowerShell, Azure, Security, and more.
In addition are links to all my Pluralsight courses, as well as the books I’ve co-authored.
Finally, I’m in the process of creating my own YouTube channels. While Pluralsight will host my long form training courses, ranging from one to six hours, YouTube will have very short videos focused on a specific topic. Ideally 15 minutes max so you can get answers quickly. Stay tuned for more as I get these published.
I did say channels, plural, as I’ve decided to make two YouTube channels. One will host only technical material for subjects such as PowerShell, Azure, SQL Server Business Intelligence, and the like.
The second will host videos for my hobbies, allowing me to post videos of a more personal interest such as Minecraft and Ham (Amateur) Radio. This will make it easier for people to digest the topics of interest to them.
I hope you enjoy the new format, and check back often.
With the explosion of digital data, achieving optimum database performance has become a primary concern for every database professional. To improve efficiency when managing a complex IT environment, DBAs must consistently stay one step ahead, learning the best practices, proven strategies, and innovative approaches being applied to different DBA processes. Here are 5 key areas to consider for driving database efficiency even as data grows exponentially:
1. Knowing What Needs Your Focus
As a DBA, it is important to have a good fundamental understanding of your IT infrastructure. It’s critical to understand what’s working well and what’s underperforming within the database infrastructure itself – e.g., whether you’re facing memory issues or I/O issues. It’s also critical to understand how the database is reached – what network, application, or VM issues could be impacting database availability or performance. Ensure your perspective is broad enough to cover the parts of the technology stack that need your attention.
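One quick way to see which part of the stack deserves your attention is to check the instance’s top wait types. Here’s a minimal sketch using Invoke-Sqlcmd from the SqlServer PowerShell module; the instance name is a placeholder for your own.

# Requires the SqlServer module; 'localhost' is a placeholder instance
Import-Module SqlServer

$query = @"
SELECT TOP 10 wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT LIKE 'SLEEP%'  -- filter out benign idle waits
ORDER BY wait_time_ms DESC;
"@

# Heavy I/O waits and heavy memory waits point to very different fixes
Invoke-Sqlcmd -ServerInstance 'localhost' -Query $query | Format-Table -AutoSize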
2. Performing Periodic Health Checks
Database corruption hits without warning and can have a devastating impact on your data if you are unprepared. Backups are essential, but if you are backing up corrupt data, all your effort goes down the drain. To prevent such a scenario, it is important to perform health checks periodically using a standardized process. As a rule of thumb, DBAs should check and validate the consistency and integrity of their databases frequently, to make sure an accurate, valid backup is always available in case the need arises.
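For example, a scheduled script can run an integrity check against every user database before you rely on its backups. Below is a sketch using DBCC CHECKDB via the SqlServer PowerShell module; the instance name is a placeholder.

# Requires the SqlServer module; 'localhost' is a placeholder instance
Import-Module SqlServer
$instance = 'localhost'

# Get every user database (database_id > 4 skips the system databases)
$databases = Invoke-Sqlcmd -ServerInstance $instance `
    -Query 'SELECT name FROM sys.databases WHERE database_id > 4;'

foreach ($db in $databases)
{
  Write-Host "Checking integrity of $($db.name)..."
  Invoke-Sqlcmd -ServerInstance $instance `
    -Query "DBCC CHECKDB([$($db.name)]) WITH NO_INFOMSGS;" `
    -QueryTimeout 600  # allow up to 10 minutes; raise for large databases
}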
3. Fine-tuning SQL Server Performance
The biggest challenge facing any DBA is how to improve, optimize, and maintain SQL Server database performance. When tuning a busy system, considering the full range of KPIs can get downright overwhelming. Use online guides to identify the metrics that actually matter and make improvements accordingly. For example, a sudden drop in page life expectancy reflects an increase in your I/O requirements, which means you should check which processes were running at the time.
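Page life expectancy, for instance, is easy to check from the DMVs. A minimal sketch, again assuming the SqlServer module and a placeholder instance name:

# Current Page Life Expectancy, in seconds, from the buffer manager counters
Import-Module SqlServer

$query = @"
SELECT counter_name, cntr_value AS seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
  AND object_name LIKE '%Buffer Manager%';
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $query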
4. Staying Compliant
Compliance can take a toll on compute resources, creating ongoing stress. While it may seem tempting to monitor every single transaction, doing so can kill your performance because it requires a large amount of storage space.
It is important to have an audit strategy in place, with well-defined data and events, before you start. This approach will help you make any necessary adjustments over time and track the results for quarterly and annual audits.
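As a sketch of what a narrow, well-defined audit can look like, the example below uses SQL Server Audit to capture only failed logins rather than every transaction. The audit names and file path are placeholders, and it assumes sysadmin rights plus the SqlServer module.

# Create a narrow server audit for failed logins (names and path are examples)
Import-Module SqlServer

$auditSql = @"
CREATE SERVER AUDIT [FailedLoginAudit]
TO FILE (FILEPATH = 'C:\Audits\');  -- the folder must already exist
GO
ALTER SERVER AUDIT [FailedLoginAudit] WITH (STATE = ON);
GO
CREATE SERVER AUDIT SPECIFICATION [FailedLoginSpec]
FOR SERVER AUDIT [FailedLoginAudit]
ADD (FAILED_LOGIN_GROUP)
WITH (STATE = ON);
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $auditSql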
5. Leveraging a Modern Database
The emergence of new generation applications that require both scale and speed to function at peak efficiency has exposed the flaws and gaps in existing database technologies. Scaling up has reached its limits, while scaling out has historically been very difficult. Modern databases support key features that can boost app performance and improve uptime, but taking advantage of these capabilities has required substantial application recoding.
Database load balancing software makes SQL Server management easier, avoiding the need for code changes to support features at the application tier. It enables geo-aware load balancing, supports app-transparent failover, transparently delivers read/write split, enables query routing, and performs multiplexing and connection pooling, letting DBAs tackle the challenges of an ever-growing pool of database servers. By deploying database load balancing software, DBAs can harness the full capabilities of their SQL Server databases and efficiently address the issues that impact their ability to manage and optimize them.
About the author: Tony is a self-proclaimed tech geek with a passion for ScaleArc’s disruptive technology innovation in database load balancing. He enjoys dissecting tech topics such as transparent failover, centralized control, ACID compliance, database scalability, and downtime effects. On his days off, he can be found watching sci-fi movies, rock climbing, or volunteering.
Disclaimer: This post is not an advertisement. The owner of this blog has received no compensation for the placement of this guest post.
I’ve been working a lot in the Azure PowerShell area of late. One thing I wanted to be able to do is have my scripts log in to Azure automatically. In many examples the cmdlet Save-AzureRmProfile was used to save your Azure credentials, so that later you could use Import-AzureRmProfile to import them.
But, when I attempted to run Save-AzureRmProfile I got the error ‘Save-AzureRmProfile is not recognized as the name of a cmdlet, function, script file, or operable program’. Huh? I checked the docs, and it does include a listing for Save-AzureRmProfile.
This is a case of the PowerShell AzureRM module getting ahead of the docs. After beating my head against the wall, I found the cmdlets had been replaced with the new noun of AzureRmContext.
To use them, first log in to Azure manually. Then use the new Save-AzureRmContext to save your information to a file.
# Setup – First login manually per previous section
# Now save your context locally (Force will overwrite if there)
$path = "C:\Azure\PS\ProfileContext.ctx’
Save-AzureRmContext -Path $path -Force
Once that’s done, from then on you can use Import-AzureRmContext to automate the login.
# Once the above two steps are done, you can simply import
$path = 'C:\Azure\PS\ProfileContext.ctx'
Import-AzureRmContext -Path $path
Be warned, this does present a security issue. If someone were to steal your context file, they could then log in as you. You need to be sure your context file is stored in a safe location no one else can access.
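One possible way to tighten things up on Windows is to strip the inherited permissions from the file and grant access only to your own account. A minimal sketch, assuming an NTFS volume; adjust the path to wherever you saved your context.

# Restrict the context file so only the current user can access it
$path = 'C:\Azure\PS\ProfileContext.ctx'
$user = "$env:USERDOMAIN\$env:USERNAME"

$acl = Get-Acl -Path $path
$acl.SetAccessRuleProtection($true, $false)  # remove inherited permissions
$rule = New-Object -TypeName System.Security.AccessControl.FileSystemAccessRule `
    -ArgumentList $user, 'FullControl', 'Allow'
$acl.SetAccessRule($rule)
Set-Acl -Path $path -AclObject $acl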