Search Results

Search found 27337 results on 1094 pages for 't sql'.

Page 815/1094 | < Previous Page | 811 812 813 814 815 816 817 818 819 820 821 822  | Next Page >

  • Denormalization Strategies

    In building a database, typically we want a well-normalized design. However, there are cases where denormalization is worth considering in complex systems. Timothy Claason gives you some thoughts on the subject.

    Read the article

  • Analysing Indexes - reducing scans.

    - by GrumpyOldDBA
    The whole subject of database/application tuning is sometimes akin to a black art; it's pretty easy to find your worst 20 whatever, but actually seeking to reduce operational overhead can be slightly more tricky. If you ever read through my analysing indexes post you'll know I have a number of ways of seeking out tuning opportunities in the database. This is a slightly different slant on one of those, which produced an interesting side effect. We all know that except for very small tables avoiding...(read more)
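
    (As a point of reference only - a minimal sketch of the kind of DMV query this sort of index analysis often starts from, assuming SQL Server 2005 or later; the scan-versus-seek filter is an illustration, not taken from the post:)

        -- Indexes that are scanned more often than they are sought in the current database:
        -- typical candidates for review when trying to reduce scan overhead.
        SELECT  OBJECT_NAME(s.[object_id]) AS table_name,
                i.name                     AS index_name,
                s.user_scans,
                s.user_seeks,
                s.user_lookups
        FROM    sys.dm_db_index_usage_stats AS s
        JOIN    sys.indexes AS i
                ON  i.[object_id] = s.[object_id]
                AND i.index_id    = s.index_id
        WHERE   s.database_id = DB_ID()
        AND     s.user_scans > s.user_seeks
        ORDER BY s.user_scans DESC;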

    Read the article

  • Excel 2013 Data Explorer and GeoFlow make 3-D maps quick and easy

    - by John Paul Cook
    Excel add-ins Data Explorer and GeoFlow work well together, mainly because they just work. Simple, fast, and powerful. I started Excel 2013, used Data Explorer to search for, examine, and then download latitude-longitude data and finally used GeoFlow to plot an interactive 3-D visualization. I didn’t use any fancy Excel commands and the entire process took less than 3 minutes. You can download the GeoFlow preview from here. It can also be used with Office 365. Start by clicking the DATA EXPLORER...(read more)

    Read the article

  • SSRS 2008 R2 KPIs with bullet graphs

    Key Performance Indicators are typically displayed in a scorecard with stoplight indicators, which are either red, amber or green light icons. The limitation of this kind of indicator is that all you can see are the actual and target values in two separate fields, plus the status of the KPI as a red, amber or green color. If the user wants to figure out the thresholds associated with the KPI, those values are generally not visible. Further, representing the threshold values in the scorecard itself defeats the purpose of the scorecard: the scorecard should display the KPI's status in the most summarized form and use a minimal amount of space on the dashboard. In this tip we will look at how to address this issue.
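
    (Purely to illustrate the threshold problem described above - a T-SQL sketch against a hypothetical KPI table; the table name and the per-KPI threshold columns are assumptions, not taken from the tip:)

        -- Classify each KPI as Red/Amber/Green from its actual, target and threshold factors,
        -- the same status a stoplight indicator or bullet graph would convey.
        SELECT  k.kpi_name,
                k.actual_value,
                k.target_value,
                CASE
                    WHEN k.actual_value >= k.target_value * k.green_threshold THEN 'Green'
                    WHEN k.actual_value >= k.target_value * k.amber_threshold THEN 'Amber'
                    ELSE 'Red'
                END AS kpi_status
        FROM    dbo.KpiValues AS k;   -- hypothetical table holding actual, target and threshold factors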

    Read the article

  • Windows Server 2003 network boogey men every DBA should know

    - by merrillaldrich
    Recently I was again visited by my old friends TCP Chimney and SynAttackProtect. (Yeah, sometimes I feel like I mostly blog about 5-year-old problems, but many of us as DBAs have to work on older versions or older systems, and so repeat older problems :-). This has been written about before, but as I BinGoogled around I noticed you are more likely to find the documents if you search for the cause, and not the symptoms. Most people who face a problem, of course, know the symptoms but not the cause....(read more)

    Read the article

  • Creating a Corporate Data Hub

    - by BuckWoody
    The Windows Azure Marketplace has a rich assortment of data and software offerings for you to use – a type of Software as a Service (SaaS) for IT workers, not necessarily for end-users. Among those offerings is the “Data Hub” – a codename for a project that, ironically, actually does what the codename says.

    In many of our organizations we have multiple data quality issues. Finding data is one problem, but finding it just once is often a bigger problem. Lots of departments and even individuals have stored the same data more than once, and in some cases made changes to one of the copies. It’s difficult to know which location or version of the data is authoritative. Then there’s the problem of accessing the data. It’s fairly straightforward to publish a database, share or other location internally to store the data, but then you have to figure out who owns it, how it is controlled, and pass out the various connection strings to those who want to use it. And then you need to figure out how to let folks access the internal data externally – bringing up all kinds of security issues. Finally, in many cases our user community wants us to combine data from the internal sources with external data, bringing the security, connection-string and exploration issues up all over again.

    Enter the Data Hub. This is an online offering where you assign an administrator and data stewards. You import the data into the service, and it’s available to you – and only you and your organization, if you wish. The basic steps for this service are to set up the portal for your company, assign administrators and permissions, and then assign data areas and import data into them. From there you make them discoverable, and then you have multiple options by which you or your users can access that data. You’re then able, if you wish, to combine that data with other data in one location.

    So how does all that work? What about security? Is it really that easy? And can you really move the data definition off to the Subject Matter Experts (SMEs) that know the particular data stack better than the IT team does? Well, nothing good is easy – but using the Data Hub is actually pretty simple. I’ll give you a link in a moment where you can sign up and try this yourself. Once you sign up, you assign an administrator. From there you’ll create data areas, and then use a simple interface to bring the data in. All of this is done in a portal interface – nothing to install, configure, update or manage.

    After the data is entered and you’ve assigned metadata to describe it, your users have multiple options to access it. They can simply use the portal – which actually has powerful visualizations you can use on any platform, even mobile phones or tablets. Your users can also hit the data with Excel – which gives them ultimate flexibility for display, all while using an authoritative, single reference for the data. Since the service is online, they can do this wherever they are – given the proper authentication and permissions. You can also hit the service with simple API calls, like this one from C#: http://msdn.microsoft.com/en-us/library/hh921924 – you can make HTTP calls instead of code, and the data can even be exposed as an OData feed.

    As you can see, there are a lot of options. You can check out the offering here: http://www.microsoft.com/en-us/sqlazurelabs/labs/data-hub.aspx and you can read the documentation here: http://msdn.microsoft.com/en-us/library/hh921938

    Read the article

  • SSIS - Range lookups

    - by Repieter
    When developing an ETL solution in SSIS we sometimes need to do range lookups. Several solutions for this can be found on the internet, but we have built another solution which I would like to share, since it's pretty easy to implement and the performance is fast.

    You can download the sample package to see how it works. Make sure you have the AdventureWorks2008R2 and AdventureWorksDW2008R2 databases installed. (Apologies for the layout of this blog, I don't do this too often :))

    To give a little bit more information about the example, this is basically what it does: we load a fact table and do an SCD type 2 lookup against the Product dimension. This is done with a script component.

    First we query the data warehouse to create the lookup dataset. The query that is used for that is:

        SELECT
            [ProductKey]
            ,[ProductAlternateKey]
            ,[StartDate]
            ,ISNULL([EndDate], '9999-01-01') AS EndDate
        FROM [DimProduct]

    The output of this query is stored in a DataTable:

        string lookupQuery = @"
            SELECT
                [ProductKey]
                ,[ProductAlternateKey]
                ,[StartDate]
                ,ISNULL([EndDate], '9999-01-01') AS EndDate
            FROM [DimProduct]";

        OleDbCommand oleDbCommand = new OleDbCommand(lookupQuery, _oleDbConnection);
        OleDbDataAdapter adapter = new OleDbDataAdapter(oleDbCommand);

        _dataTable = new DataTable();
        adapter.Fill(_dataTable);

    Now that the dimension data is stored in the DataTable, we use the following method to do the actual lookup:

        public int RangeLookup(string businessKey, DateTime lookupDate)
        {
            // set default return value (Unknown)
            int result = -1;

            DataRow[] filteredRows;
            filteredRows = _dataTable.Select(string.Format("ProductAlternateKey = '{0}'", businessKey));

            for (int i = 0; i < filteredRows.Length; i++)
            {
                // check if the lookup date is found between the start date and end date of any of the records
                if (lookupDate >= (DateTime)filteredRows[i][2] && lookupDate < (DateTime)filteredRows[i][3])
                {
                    result = (filteredRows[i][0] == null) ? -1 : (int)filteredRows[i][0];
                    break;
                }
            }

            filteredRows = null;

            return result;
        }

    This method is executed for every row that passes the script component. This is implemented in the ProcessInputRow method:

        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            // Perform the lookup operation on the current row and put the value in the surrogate key attribute
            Row.ProductKey = RangeLookup(Row.ProductNumber, Row.OrderDate);
        }

    Now what actually happens?

    1. Every record passes the business key and the order date to the RangeLookup method.
    2. The DataTable is then filtered on the business key of the current record. The output is stored in a DataRow[] object.
    3. We loop over the DataRow[] object to see where the order date meets the following expression: (lookupDate >= (DateTime)filteredRows[i][2] && lookupDate < (DateTime)filteredRows[i][3])
    4. When the expression returns true (so where the date falls between the StartDate and the EndDate), the surrogate key of the dimension record is returned.

    We have done some testing with this solution and it works great for us. Hope others can use this example to do their range lookups.
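
    (For comparison - a set-based T-SQL sketch of the same SCD type 2 range lookup; the DimProduct columns follow the query above, while the staging fact table name is an assumption made for illustration:)

        -- Match each incoming fact row to the dimension row whose validity range covers the order date.
        SELECT  f.ProductNumber,
                f.OrderDate,
                ISNULL(d.ProductKey, -1) AS ProductKey       -- -1 = unknown member
        FROM    dbo.StagingFactInternetSales AS f            -- staging table name is illustrative
        LEFT JOIN dbo.DimProduct AS d
               ON  d.ProductAlternateKey = f.ProductNumber
               AND f.OrderDate >= d.StartDate
               AND f.OrderDate <  ISNULL(d.EndDate, '9999-01-01');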

    Read the article

  • Parameterize Charts using Excel Slicers in PowerPivot

    - by Marco Russo (SQLBI)
    One new nice feature of Excel 2010 is the Slicer. Usually, slicers are used to filter data in a PivotTable, but they can also be useful to parameterize an algorithm or a chart! We discussed this technique in our book, but Alberto Ferrari wrote a post that shows how to use it to allow the user to select two stocks to be compared in an Excel chart – and as you might imagine, this will also work when you publish the workbook to SharePoint! This is the result: Nice to see that...(read more)

    Read the article

  • Education and Career Resources from Microsoft and the Community

    - by KKline
    Sometimes I'm timely in getting the news out on useful resources. And other times, I'm a bit slower on the draw. As I told my friends back on New Year's Day, "As an official member of the Procrastinators Club, welcome to 2008!" On the other hand, it's always good to remind folks of great resources that are still available and on the shelf. Why? Well, the Internet hits us with such a constant deluge of new material that we often forget about the old(ish) stuff that's still really useful. Darth...(read more)

    Read the article

  • Many-to-many relationships in pharmacology

    - by John Paul Cook
    When I was in my pharmacology class this morning, I realized that the instructor was presenting a classic relational database management system problem: the many-to-many relationship. He said that all of us in nursing school must know our drugs backwards and forwards. I know how to model that! There are so many things in both healthcare and higher education that could benefit from an appropriate application of technology. As a student, I'd like to be able to start with a drug, a disease, a name of...(read more)

    Read the article

  • BI Evening @ Hitachi Consulting

    - by tsutha
    The next BI Evening is hosted by Hitachi Consulting. Please register ASAP, as there is only a limited number of spaces available. Of course there will be free beer and pizza. It's a great place to network with industry experts. If you are looking for a job, don't be shy about talking to us. Thanks, Sutha

    Read the article

  • Terrible Performance with SATA Drives on Dell PowerEdge, steps to troubleshoot?

    - by Tom
    I had asked this question earlier and the question went missing, so here it is again. I bought a Dell PowerEdge 2950 to use as an in-house QA server. Disk performance is beyond terrible: 1000-4000 ms response times on the drive holding our SQL Server database .mdf, and the SQL Server disk queue is upwards of 300 at times. I'm a software guy, can anyone help me with steps to determine the issue? I don't know what RAID controller it has, how can I determine that? I'm speculating it could be a BIOS issue. Perhaps the server used to have another kind of drive in it and when I added SATA the ??? buffer size is wrong??? Perhaps I chose the wrong options (I chose the defaults) when setting up the RAID 1 arrays? I thought RAID 1 was a performance array?
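
    (Not part of the original question - one hedged first step, assuming SQL Server 2005 or later, is to measure per-file I/O latency from inside SQL Server before touching the hardware:)

        -- Average read/write latency per database file, from SQL Server's own I/O statistics.
        SELECT  DB_NAME(vfs.database_id) AS database_name,
                mf.physical_name,
                vfs.num_of_reads,
                vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS avg_read_latency_ms,
                vfs.num_of_writes,
                vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_latency_ms
        FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
        JOIN    sys.master_files AS mf
                ON  mf.database_id = vfs.database_id
                AND mf.[file_id]   = vfs.[file_id]
        ORDER BY avg_read_latency_ms DESC;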

    Read the article

  • Fixed Bid vs. T&M – Take 2

    - by AjarnMark
    One of my most popular blog entries of all time is my Contracting Tips: Fixed Bid vs. T&M post from January, 2004.  This post consistently shows up in my referrers list, usually coming from a search engine.  Recently, Brent Ozar (@BrentO) wrote a great argument for why he always bills by the hour (a.k.a. Time & Materials or T&M) which itself was a response to Mark Richman’s (@mrichman) post on why he never bills by the hour (fixed bid).  Each article has good arguments, and I encourage you to read them both and choose the best approach for you. As for me, my experience parallels Brent’s and I historically have leaned toward the Time & Materials model.

    Read the article

  • PASS Summit '12, Day One

    - by AaronBertrand
    I had an incredibly interesting experience getting to Seattle this week. I flew out of Providence through Philadelphia. Apparently there was some smoke in one of the towers at PHL, so our flight was an hour delayed. I missed my connection by three minutes. I was absolutely amazed that after a one-hour, full ground stop, flights shortly afterward were leaving exactly on time. It was like anti-Aaron magic. I got to the gate and watched my plane back away. My luggage never would have made it but it...(read more)

    Read the article

  • Cell Transitions in Excel 2013 Preview–Fixed

    - by simonsabin
    If you’ve downloaded Excel 2013 and been working with it, you may have noticed the new cell transition feature. I'm not sure why they put it in; it feels a bit like the Aero interface, which I understand has been dropped in Windows 8. What you may have found is that the transition is buggy, Excel hangs, or the transition is jumpy. Well, I found the fix on http://answers.microsoft.com/en-us/office/forum/office_home-excel/hardware-acceleration-problem-with-excel-2013/894da202-48c0-4442-a371-955587c1b7c0 For...(read more)

    Read the article

  • General High-Level Assessment

    - by tcarper
    Guys and Gals, I've been tasked with a doozy of an assignment. The objective is something akin to a "laying on of hands" on several database servers which work in concert to provide data to various Web, Client-Server and Tablet-Sync'd distributed Client-Server programs. More specifically, I've been asked to come up with a "Maintenance Plan" which includes recommendations for future work to improve these machines' performance/reliability/security/etc. Might there be some good articles on teh interwebs y'all could point me towards which would give me a good basis to start from? Articles describing "These are the top 4 overarching categories and this is how you should proceed when drilling down on each of them" sort of thing would be fabulous. The databases are all SQL 2005; however, the compatibility level is 80 and they were originally created with ERwin based on SQL 6.5. The OSs are all Windows Server 2003. Thanks all! Tim
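
    (Not from the question itself - a minimal sketch of a first inventory pass, assuming SQL Server 2005 as stated above:)

        -- Which databases are still at compatibility level 80, and what recovery model each one uses.
        SELECT  name,
                compatibility_level,
                recovery_model_desc
        FROM    sys.databases
        ORDER BY name;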

    Read the article

  • SSIS Denali CTP1 Source Assistant

    - by andyleonard
    I like the new Data Flow Source Assistant in SSIS Denali. The default view is shown above, with the "Show installed only" checkbox checked. When not checked, the list of Source types changes: In previous versions of SSIS, I rarely created connections in the Connection Managers pane - I usually hit a New button in either a Source or Destination Adapter, or in a task. It was just easier letting the task or adapter pick the proper Connection Manager editor. This is handy and a time-saver. :{>...(read more)

    Read the article

  • PowerPivot: editing measures when you reach 45

    - by AlbertoFerrari
    I have always been used to small fonts but now, as I am getting older, I'd better admit that a larger font is much more relaxing. Editing PowerPivot measures has always been a pain, since all you have available is a small text box, and I hate to admit that I got used to leveraging ZoomIt for a long time to edit measures. Today I ran into a great Windows feature that I did not know about: ctrl-wheel on the mouse inside a textbox increases the font size of the text box. It seems to work with most textboxes...(read more)

    Read the article

  • OT: Improbable use for an iPad?

    - by merrillaldrich
    Here's an interesting tidbit: I have noticed an even more pronounced trend toward centralized or virtual workstations lately. Both my wife and I can sit at home, as we are now, at the dining room table and work on our laptops (exciting life, I know!), but neither of us is actually working locally on these machines. We are both remoting into machines at our respective workplaces. Hers is a desktop machine physically located at her desk, while mine is a virtual workstation in my company's data center...(read more)

    Read the article

  • Thinking in DAX (#powerpivot and #bism)

    - by Marco Russo (SQLBI)
    Last week Alberto published an interesting post about Counting Products in the Current Status with PowerPivot. Starting from a question raised by a reader, Alberto described how to solve a common issue (determine the “current status” of each item at a given point in time, starting from a transactions table) by using a single DAX formula. I suggest you read his post to understand the technical details. What is inspiring about this example is that we can look at Vertipaq and DAX from several...(read more)
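
    (Alberto's post solves this with a single DAX formula; purely as a point of comparison, the same "status as of a date" question in set-based T-SQL might look like the sketch below, against a hypothetical Transactions table that is not part of the post:)

        -- Hypothetical Transactions(ItemID, StatusDate, Status):
        -- return the most recent status of each item as of a reference date.
        DECLARE @AsOfDate date = '2011-03-01';   -- illustrative reference date

        SELECT  t.ItemID,
                t.Status
        FROM    dbo.Transactions AS t
        JOIN    (SELECT ItemID, MAX(StatusDate) AS LastDate
                 FROM   dbo.Transactions
                 WHERE  StatusDate <= @AsOfDate
                 GROUP BY ItemID) AS latest
                ON  latest.ItemID   = t.ItemID
                AND latest.LastDate = t.StatusDate;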

    Read the article

  • DTLoggedExec 1.1.2008.4 Released!

    - by Davide Mauri
    Today I've released the latest version of my DTExec replacement tool, DTLoggedExec. The main changes are the following:

    - Used a new strategy for version numbers; they will now follow the pattern Major.Minor.TargetSQLServerVersion.Revision
    - Added support for Auto Configurations
    - Fixed a bug that reported an incorrect number of errors and warnings to Log Providers
    - Fixed a bug that prevented correct casting of values when using the /Set and /Param options
    - Errors and warnings are now counted more precisely
    - Updated database and log import scripts to categorize logs by projects and sections, e.g. Project: MyBIProject; Sections: Staging, Datawarehouse
    - Removed unused report stored procedures from the database
    - Updated samples: 12 samples are now available to show ALL DTLoggedExec features
    - From this version only SSIS 2008 will be supported

    http://dtloggedexec.codeplex.com/releases/view/62218

    It is useful to say something more on a couple of specific points.

    From this version only SSIS 2008 will be supported: yes, Integration Services 2005 is no longer supported. The latest version capable of running SSIS 2005 packages is 1.0.0.2.

    Updated database and log import scripts to categorize logs by projects and sections: when you import a log file, you can now assign it to a project and to a section of that project. In this way it's easier to gather statistical information for an entire project or a subsection of it. This also allows logged data from packages belonging to different projects to be stored in the same database.

    Updated samples: a complete set of samples that shows how to use all DTLoggedExec features is now shipped with the product. Enjoy!

    Added support for Auto Configurations: this point will have a post of its own, since it's quite important and is by far the biggest new feature introduced in this release. To explain it in a few words, I can just say that you don't need to waste time with complex DTS configuration files or options, since a package will configure itself automatically. You just need to write a single statement as a parameter for DTLoggedExec. This feature can simplify deployment *a lot* :)

    In the next few days I'll write the mentioned post on Auto Configurations and I'll update the documentation available on the DTLoggedExec website: http://dtloggedexec.davidemauri.it/MainPage.ashx

    Read the article

  • PASS Summit 2012 Women In Technology Luncheon

    - by AllenMWhite
    My final stint at the Summit Blogger's Table(tm) is for the annual WIT luncheon. I do appreciate the honor that PASS conferred on me by inviting me to the "table" for the event; it's been a lot of fun (even if there were some moments that weren't). Newly-elected board member Wendy Pastrick is the MC for this year's luncheon, and the panel consists of Stefanie Higgins, Denise McInerny, Kevin Kline, Jen Stirrup and Kendra Little. I'm pleased to say that I know each one of them except Stefanie Higgins,...(read more)

    Read the article

  • PASS Summit Location follow up - result analysis

    - by simonsabin
    I've had a chance to look at the results directly and it is clear that there is a tough choice. On the one hand people are saying that they prefer to have PASS put their money into chapters and things like 24 Hours of PASS rather than an event on the east coast. At the same time, almost 50% more people said they would be more likely to attend an East Coast event than a Seattle event, and 60% more said they would be more likely to attend a US Central region event. What's more, 60% said that the summit should be outside of Seattle every other year, with only 19% saying it should always stay in Seattle. So clearly there is a huge desire for a non-Seattle event.

    Looking at the other reasons for keeping it in Seattle, the big one is that people want Microsoft speakers. More people think it's somewhat important or very important that the conference is within walking distance of the hotels and restaurants. Essentially the Q6 questions show an even balance for a normal conference, highlighting that people are prepared to travel, not with the family, and that they want a well laid out conference.

    What's very annoying is that the questions, as people have commented, were biased towards certain answers. For instance there was no option about whether people feel it's important to have industry-leading speakers, MVPs etc. at the conference, only questions about Microsoft speakers. I know that when writing a survey it is very difficult to avoid biasing the answers one way or another. There was also no choice to show people's preference: would people prefer Microsoft speakers, or the summit being held on the East Coast/Central US? I also find it amazing that people prefer hundreds of developers rather than the SQLCAT and CSS teams; surely that indicates another issue, a lack of understanding of what these teams do.

    All in all it is clear that people showed they want an event outside of Seattle, and that they don't want PASS to be putting money into that instead of into other community activities. I find it surprising that there appears to have been a huge weighting given to certain questions, which has prioritised them over the huge desire for a PASS summit outside of Seattle. Let's see where we will be in 2013, or maybe they will rethink 2012, who knows.

    Read the article

  • Batch Script With SQLCMD Usage

    - by user52128
    Hi All, I am writing a batch script which has to read a set of SQL files that exist in a folder and then execute them using the SQLCMD utility. When I try to execute it, it does not create any output file. I am not sure where I am wrong and I am not sure how to debug the script. Can someone help me out with the script?

        @echo off
        FOR %F IN (C:\SQLCMD\*.SQL) DO sqlcmd -S LENOVO-C00 -U yam -P yam!@ -i %F -o C:\SEL.txt -p -b
        IF NOT [%ERRORLEVEL%] ==[0] goto get_Error
        :Success
        echo Finished Succesffuly
        exit /B 0
        goto end
        :get_error
        echo step Failed
        exit /B 40
        :end

    Read the article

  • mysql single database relocation

    - by asdmin
    I would like to know if it's possible to operate different databases in different filesystem locations. Background: we are a hosting service which hosts MySQL, web, and SMTP for its customers, but all our services (SQL, SMTP, HTTP) are located in a different place. We are going to assign a single logical volume to a customer, which will accommodate the customer's mail, web pages and (hopefully) SQL database. Web pages and mail are already covered, but I am not able to find a configuration setting that would enable me to specify the location of a database (the directory where MySQL stores the DB). Let me highlight that the target here is to relocate different databases to different locations in the filesystem, not to move them from a single place to another (single) place. Also, please do not bother answering with soft and hard symbolic links. ;) Thanks

    Read the article
