Tuesday, February 03, 2015

SQL Saturday ALBER..ALBUq...ALBURQUERQUE... New Mexico

I was thrilled to get the email confirming that I was selected to speak at SQLSaturday #358 on February 07, 2015. 

It's been two years since I went to New Mexico; back then I had the honor of speaking at the inaugural event, and I am happy to return. Last year, family got in the way of my attending. I can't remember what I had to do instead, but I am sure it was the right decision to hang with my family then.

This time around, I am bringing my 15-year-old daughter with me. She is learning how to drive, and a road trip is one of the ways we tackle that in my family. So why not make a 9-hour trek south to New Mexico and back?

We will be leaving midday on Thursday, driving 5 hours, and spending the night in Utah, near the border. On Friday, we will get up and go see the Four Corners area, which we have never seen before, then travel on to New Mexico. Once there, I'm sure we'll hit the hotel, and then make our way to the speaker dinner. I love attending these little get-togethers, so that we can see and say hi to our #sqlfamily. Hugs and introductions to my daughter will be in order. And food.

The next day will find us both attending the SQL Saturday event. I will force my daughter to attend my session, just so there are at least 3 of us in there: me, her, and the room proctor.
She will hang out and maybe go with me to sessions, or maybe run the PASS booth. I will spend some time behind the table myself, helping out the region that I used to mentor as a Regional Mentor (RM) for PASS. This recently changed, and I lost this area as one of my responsibilities. I am sad to miss out on continued interactions with these folks, but will keep them up from the #sqlfamily perspective instead.

At my session we will be talking about documentation. And I would love to assign you homework, if you are going to be in this session: bring me some piece of a potential document, and let's work on it either in the session or after, applying the techniques we'll discuss. Documenting what we do is vital, and has become an important part of my job. I don't necessarily love to document, but I do love when it has been done properly.

After the event, I am not sure yet if we will be attending the after party. I will probably let my dear daughter decide this. It will have been a long day of potential boredom for her. 

The next day, we will make the 9-hour journey back home, trading off driving duties and helping her learn how to drive. It will be exciting and scary. If you are of the persuasion that prays, please keep us in mind, since we almost die every time she gets behind the wheel. Just kidding, she is not that terrible, but still, it's a bit scary, I'll admit.

I took my son with me to San Diego SQL Saturday, and we made a trip out of it. At the tail end of PASS, my wife came to visit me, and we made a trip out of it as well. I took my oldest daughter to Austin this last weekend. And this next weekend will find me in Albuquerque, with my other child. We are making the most out of these events, combining work and family, #sqlfamily and SQL. 

I look forward to these events, to sharing my knowledge and experiences, to learning from you, and to seeing my #sqlfamily.

Tuesday, January 27, 2015

SQL Saturday Austin 2015

This weekend is the Austin SQL Saturday event. I blogged about it here previously. You can find out more about the event here.

Some of my favorite #SQLFamily will be in attendance, as well as some new members, I hope. As you attend these events, make sure to reach out to the speakers for more information. But not just them: reach out to fellow attendees, even volunteers, even organizers. Basically, reach out to any and all folks that can help you with your career and current problems.

Don't just seek out the like-minded folks or those that can help with immediate needs. If you are an administrator, spend some time talking to a BI person, and vice versa. Just reach out and talk to them. Hand them a business card (you did remember to bring those, didn't you?) and start building your network.

While I am in the area, please come and talk to me. I like talking to people. I have tons of stories to share. I may even be able to help you, or even better, recruit you to help me.

See you in Texas.

Friday, November 21, 2014

SQL Saturday Austin

I was thrilled to get the email confirming that I was selected to speak at SQLSaturday #362 on January 31, 2015. 

The last time I was in Austin was 2011, for SQL Saturday 97. My wife and I flew out there and hung out with my sister and her family while I attended the event. My wife had a great time hanging out in Texas, a first for her. And we both enjoyed the weekend with my sister. 

This time around, I am bringing my 18-year-old daughter with me to hang out with my sister. Hopefully she won't corrupt her. She being my sister, and her being my daughter. 

Ever since my oldest came into the world, she has had a special affinity for her aunt, my sister. The two of them share a lot of traits and even resemble each other. Sadly, we have not lived close enough to entertain the togetherness that these two have craved for so many years. But I hope to help with that, this trip.

We will fly out Thursday before the event, and this will give my daughter all Thursday night, all day Friday and Saturday, along with most of the day Sunday to hang out with her doppelganger. I hope they have a blast, get into a ton of trouble and bond over everything they can in the short amount of time we can provide them. 

In the meantime, I will go to the speaker dinner and spend most of Saturday hanging out with my #SQLFamily at the event. The first time I was here, it was my fourth SQL Saturday event. This time it'll be something like my 16th. I try to hit a few each year, but never more than 5, as it is difficult to justify more than that with my current situation. I would love the opportunity to go to more, but will work with what I have been given. 

The last event I attended outside my state, I took my son with me, and we made a trip out of it. At the tail end of PASS, my wife came to visit me, and we made a trip out of it as well. Austin will allow me the opportunity to bring my oldest daughter. And the very next weekend will find me in Albuquerque, having driven there with my other child. So, we are making the most out of these events, combining work and family, #sqlfamily and SQL. 

I look forward to these events, to sharing my knowledge and experiences, to learning from you, and to seeing my #sqlfamily. 

Wednesday, October 29, 2014

PASS Summit 2014 goals - crowd source solving a problem

I've not really written my conference goals down before; they usually just rattle around in my head.
I tend to attend the conference and seek out folks that I know can add to my knowledge store or help resolve issues I am currently having.

I'm going to describe some of them here in this post, and hope that you, dear reader, can assist me with tracking them down.

I am not a BI guy, but I would like to become more of one.

One of the current projects we are working on is getting a cloud-based version of our application to work well. It is the basic application we have now, only instead of getting data from an on-premises SQL Server, it will reach into Azure Blob Storage and retrieve a JSON document. This rich document contains all the data we normally have, without the constraint of structure in an RDBMS, though it will mimic that structure. So there will be a document describing a student, and within that, there will be names and the like. Another document within this JSON doc will have scores and history and other descriptive data detailing what this student has done. Think 20 tables with 10-15 fields per table, and many rows of data (or documents in this case). These JSON docs will grow and grow with use, as more data is added.

How do I get this data out of these JSON documents, and into a system where others (internal, external, application, services) can get to it? Think reporting. Each JSON doc is a single student. Maybe I need to summarize all the students from an entire region and gain insight on something they have done. I would assume that these individual rich JSON docs will need to be extracted to some other structure, and transformed and loaded into a storage system, be it an RDBMS, or a Hadoop cluster, or some other magical solution. Maybe a data warehouse. Maybe a SQL Server. Maybe a MongoDB store.
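Whatever the destination ends up being, the extract step probably looks something like the following minimal sketch: shred one nested student document into flat, keyed rows that could be bulk-loaded into staging tables. The field names here ("student", "scores") are hypothetical stand-ins, since I'm deliberately not sharing our real schema.

```python
import json

# A made-up student document, shaped roughly like the ones described above:
# one parent record plus child collections that would map to child tables.
sample_doc = json.loads("""
{
  "student": {"id": 42, "name": "Jane Doe", "region": "Southwest"},
  "scores":  [{"test": "Math", "score": 88}, {"test": "Reading", "score": 93}]
}
""")

def flatten_student(doc):
    """Turn one student document into per-'table' row lists, carrying the
    student id into each child row as a foreign key."""
    student = doc["student"]
    student_rows = [(student["id"], student["name"], student["region"])]
    score_rows = [(student["id"], s["test"], s["score"])
                  for s in doc["scores"]]
    return {"student": student_rows, "score": score_rows}

rows = flatten_student(sample_doc)
```

Run over every document in the blob store, the per-table row lists could then feed whatever load target wins out, be it warehouse tables or something else entirely.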

What do you think it should be? How best to create this process of extracting this data and presenting it to others in a reportable fashion?

So if you are still reading this, and have an opinion on this above issue, come grab me and let's talk about it. I'd love to hear your take on this, and experience, and possible direction.

With this in mind, there is a slew of BI-related sessions that I am going to try hard to attend this year. I have flagged some of them and hope to get some insight there, as well as from other groups like SQLCAT and the other forums and opportunities that the PASS Summit offers. I'll even sit down with vendors and spell this out in the hopes that they have a magic bullet or at least a suggestion of direction.

I suggest that you, too, bring issues, real issues, to the Summit and attempt to get them solved. At the very least, by talking through the issue with others, you will discover things along the way. Maybe you will solve it by yourself, or maybe you'll get put in touch with that solitary individual in the world who has already solved it and is willing to train you on the process. Or something in between. Either way, it's better than sitting at your desk making it up yourself.

Get out and get your lurn on at the summit this next week and enjoy yourself.

Wednesday, September 24, 2014

PASS Elections 2014: my thoughts

I voted.

I think you should vote also. Go vote.

See ya at summit!

Monday, June 30, 2014

On Speaking

Back in 2004, I attended my first PASS Summit conference. I was not as integrated into the community back then, and only knew a few folks. I met some great folks on that adventure, some I still know today and some that have become quite the celebrities in the community. I attended because a local user group leader encouraged me to attend, as he encouraged me to do a great many things. One of which was to speak at the local user group. I was frightened, and exhilarated.

Having grown up in a religion that encourages its members to speak, I have had various opportunities throughout my life to stand in front of an audience and speak. I did so as a youth many times, also as a missionary, and as an adult. One in particular will always remain fresh in my mind. Usually as a youth, after speaking to the congregation, you get the 'good job' and 'atta boys' doled out by members of your congregation. They do it to make you feel better, since you probably looked terrified while in front of them. I received some of these while at church. But after church, members called me at home to congratulate me. This was when phones were not as easily accessible as today, so it seemed like an extra effort to call a child and buoy them up for a job well done. I was elated. And terrified. It meant that I had a knack for speaking, that I didn't appear to be as terrified as I felt, and that meant I oughta continue practicing the talent, or it would become latent. So I did. I never really shied away from opportunities to speak.

Fast forward to the User Group leader who asks me to speak. I say yes. I give it a shot. I am thrilled at being selected to speak, and humbled at the task, as well as terrified. But I pull it off and get compliments. Not overflowing, but not under. A sweet spot. So I repeat at the next chance.

As my career marches on, more opportunities have presented themselves. I once spoke at a Microsoft event to a large room full of attendees, following the lesson notes they had outlined, teaching about Analysis Services. I had never used it before for real, but here I was teaching it to a crowd. I did OK. Not great, but not bad enough to recognize in myself the lack of this talent, or the subsiding of talent such that I should quit. I did OK.

I remember being asked to fly to Arizona to record some sessions for a training company. I was thrilled again. Humbled. Worried. Scared. Thrilled. Me, they picked me. I did it, and it was OK. I did well enough to be asked back on multiple occasions.

When it came time for our own community events here in my home state, I volunteered not only to speak, but to help organize and do tasks and whatnot to make the event happen. I have done this every chance I get, and have even volunteered at regional events. At the same time, I volunteer in various capacities in PASS. Each time I get to do a task, a part of me is thrilled that I was picked. I was allowed to give some of me to something bigger than me.

After a while, I noticed that I was starting to expect to be picked, and when that happens, I need to remind myself of the beginnings, and those feelings of humility come back. The scared and thrilled are ever present. But remembering to be humble at being picked is key. Then the level of thrill can remain high when you are picked.

I have never spoken at the PASS Summit, though I have heard some members respond in surprise to this knowledge, as they coulda swore that I had. Nope. I have submitted a couple times, but have not been selected. I have been selected to SQL Saturdays and Code Camps. Once, I was asked not to speak at an event I had submitted to. Besides PASS Summit, I have been accepted to every event I have submitted to. Does this make me proud? Yes. But does it mean I deserve it next time? No. I try to temper the pride with the humility of being asked to speak, because, remember, that is what keeps the thrill alive.

So, in a word, get over yourself. If you are not selected, you will get over it. There will be another opportunity. If you are selected to speak, awesome. Accept it with humility. Be proud you were selected. Prepare to do as good a job as you can do. Do a better job than you did last time you were given this singular honor. Be scared and thrilled. Realize that no matter how much of a 'celebrity' you think you are, you are just a person. Treat each opportunity like your first and react accordingly. Don't ever lose that excitement that the first one infused into your being. And be scared. And be thrilled.

Monday, June 23, 2014

Can that report run faster?

We have a scenario where a report is causing us some grief. Let's call this report 'Revenue'. Let's assume that this report is shown in a 'Leadership' meeting where revenue is an important topic. Let's assume that the presenter in this meeting shows the 'Leadership' team this report by launching it in Report Manager, along with several other reports. All these reports will be used to discuss points of interest in said meeting.
So put yourself in the seat of one of the observers of said report. It is kicked off, and you watch the little spinny do its thing, while you wait for this report to materialize the data and render it visually for you to consume. You wait. And wait.

How long do you wait before you open up your device and kick off the same report to 'see if you can run it faster' than the presenter? I'm betting a few minutes, maybe 4-5. And if you do that, who else in the room is doing the same thing? Let's assume more than one individual does this, which can cause the other executions to slow, or at least causes your execution to slow.

This scenario played out a few weeks ago. The presenter kicked off the report, which timed out after 8 minutes or so, but at the same time, 4 other individuals also kicked off the report. It was in the middle of these executions that I was contacted to look at said report and see if I could make it faster. Later that day, as everyone else stopped executing it, the execution time dropped to under 2 minutes. But during the peak of multiple executions, it was taking 8 minutes. (Also, that morning, 4 times the normal quantity of reports were being generated. 4 times. As in 4,000 reports instead of 1,000.)

Over the past few weeks, I have been digging at this report and looking for solutions to its speed issue. It seems that when normal circumstances prevail, this report can retrieve and render its data in under a minute. That seems acceptable. But under load, it takes time, and causes frustration in the team that needs to consume it.

This report is written in SSRS and hits our production reporting system that is a SQL Server 2008 R2 system that utilizes several FusionIO drives to speed things along. It is a nice machine and is much faster than our old reporting system. The data seems to be properly indexed. The queries seem to function as expected. So I jumped into caching and snapshotting to investigate alternatives to 'speeding up the report'.

Let's look at the past executions of this report to see what we are dealing with. Looking into the ExecutionLog view allows us a view into what has transpired with this particular report. I sum the time to perform the tasks and convert it to minutes, so I don't have to think too hard. I have to join to the Catalog table to limit the results to the actual report that I want to see. And for sorting, let's order by the time the report started, descending, to see the latest executions first.

  SELECT
      c.[Name],
      (TimeDataRetrieval + TimeProcessing + TimeRendering)/60000.00 AS [Time],
      [RequestType],
      [TimeStart]
  FROM ReportServer.dbo.ExecutionLog el
      JOIN ReportServer.dbo.Catalog c ON c.ItemID = el.ReportID
  WHERE c.[Name] = 'revenue'
  ORDER BY [TimeStart] DESC;
This query shows me a lot of good information: how many times the report was executed, how long it took, and so on. When I ran it for the time frame of the issues, I saw exactly what I expected. A ton of executions taking a long time, and many within the same time frames. I can expand this query to add other fields, like UserName to see who executed the report, or Status to see success or otherwise, among others. Use the fields you want to tell the story of the data. My story showed me that a few individuals had executed the report repeatedly and, upon failure, attempted again, and again. As I mentioned, later in the day it sped up, once everyone stopped hitting it incessantly.

I went to SSRS Report Manager and drilled down to the properties of this report. On the 'Snapshot Options' tab, I enabled the 'Allow report history to be created manually' and 'Store all report snapshots in history' checkboxes. In the 'Report History' tab, I saw nothing, as expected; I planned on revisiting this tab to see the progress of adding caching and snapshotting. On the 'Processing Options' tab I chose 'Cache a temporary copy of the report. Expire copy of report after a number of minutes:' and entered 240 minutes. I also took this opportunity to select the 'Do not timeout report' option, hoping to allow the slow report to simply continue, instead of eventually dying, which prompts others to try the same thing. And fail.
The last thing I did was to set up cache plans on the 'Cache Refresh Options' tab. I chose 2 plans, after careful review of the executions seen in the query above. That query helped me see the times the report was executed, and when a cache would be most profitable.

With these 2 cache plans created, and the rest of the options selected, I was happy that the report would run better the next week. As Monday rolled around, I was interested to see the results. So I grabbed the above query and executed it midday Monday, only to see just the interactive report executions. Meaning I could only see individuals that had fired off this report, and the time it took to render (which was fast). I was unable to see cache executions. I would have thought that there would be an execution at the times indicated in my cache plan, so I could see how long those took. But this query failed me. To the twitterverse!!!

After receiving some help from my #SQLFamily via @markvsql, @jasonhorner, @RateControl, and @tameraclark, I was able to create this query.

  SELECT
      ItemPath,
      (TimeDataRetrieval + TimeProcessing + TimeRendering)/60000.00 AS [Time],
      [RequestType],
      [TimeStart]
  FROM ReportServer.dbo.ExecutionLog3 el
  WHERE [ItemPath] LIKE '%revenue%'
  ORDER BY [TimeStart] DESC;

This uses ExecutionLog3, which has different fields in its view. Happily, some of these fields are exactly what I need. I can add options into the WHERE clause to show 'Interactive' (run by a human) or 'Refresh Cache' (run by the caching mechanism) and see how long each took.

Looking at these results showed that the cache runs took longer than the interactive ones, as expected. But when you compare the 'Interactive' times to pre-cache times, the results are astounding. The caching takes a chunk of time up front, and then subsequent executions are tremendously faster. Meaning: let the system create the cache on its cache plan schedule, and then the humans can simply run the report and get the cached results quickly. No more multiple executions. We went from 28 executions that never seemed to complete, down to 4-5 that are a lot faster.
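The comparison itself is simple arithmetic once the log rows are in hand. Here is a minimal sketch of that summarization, using made-up timings shaped like the ExecutionLog3 columns in the query above (the 'Interactive' and 'Refresh Cache' values are the view's actual RequestType values; the numbers are illustrative only).

```python
from collections import defaultdict

# (RequestType, TimeDataRetrieval + TimeProcessing + TimeRendering in ms)
executions = [
    ("Refresh Cache", 420_000),  # the scheduled cache build does the heavy lifting
    ("Interactive",     3_000),  # humans then read from the warm cache
    ("Interactive",     2_500),
    ("Interactive",     4_100),
]

def average_minutes(rows):
    """Average the total time per RequestType, converted from ms to minutes,
    mirroring the /60000.00 conversion used in the T-SQL above."""
    buckets = defaultdict(list)
    for request_type, total_ms in rows:
        buckets[request_type].append(total_ms / 60_000.0)
    return {rt: sum(times) / len(times) for rt, times in buckets.items()}

averages = average_minutes(executions)
```

With numbers like these, the cache build averages minutes while the interactive runs average seconds, which is exactly the trade described above.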

After a few Mondays, and reviews of the resultant data, I believe I have narrowed it down to the appropriate options. I kick off the cache about a half hour before the meeting on Monday morning, and then again at noon. This caches the report data twice on a Monday: the morning cache gives the 'Leadership' meeting quick report executions, and the noon cache lets other teams utilize a fresher cache in their afternoon meetings. But as soon as a cache reaches 240 minutes old, it expires, which reverts users back to non-cached data retrieval. The rest of the week, the report should run quickly at the random times it needs to run.

The point of this story is that it is important to use data to make data faster. No single solution is better than another, and the appropriate solution may be something you are unfamiliar with. I did not know or utilize caching in reports much prior to this. I would normally have added indexes and tweaked the queries used to perform the data retrieval, trying to squeeze all the juice out of them that I could. But a simpler solution was to understand the use of this report, compare it to non-peak times of use, and adjust appropriately.

I believe that this solution is valuable, suits my needs, and satisfies the users of this report. Thus I share it with you in the hopes that you too can learn as I did.