Search Results

Search found 10442 results on 418 pages for 'blog'.

Page 119 of 418

  • New White Papers Available

    - by mattande
    New Master Data Services white papers are now available on MSDN. For an application-agnostic overview of Master Data Management, see Organizational Approaches to Master Data Management. For the steps needed to configure Master Data Services to work with a SharePoint workflow, see SharePoint Workflow Integration with Master Data Services. ...(read more)

    Read the article

  • Is the Internet Making us Smarter or Not?

    - by BuckWoody
    I’ve been reading recently about an exchange among some very bright folks, some who posit that the Internet with its instant-on, sometimes-right, big-statement-wins mentality is making people think in a more shallow way, teaching us to rely on others as experts and diluting our logical thought process. Others state that it broadens our perspective and extends our mental reach. Whenever I see this kind of exchange on two ends of a spectrum, I begin to wonder if both sides might be correct.

    I can certainly say that I have changed my way of learning, reading, and social interactions because of the Internet. And my tolerance for reading long missives has indeed gone down. I tend to (mentally and literally) “bookmark” things I never seem to have time to get back to. But I also agree that I’ve been exposed to thoughts, ideas and people I never would have encountered any other way. So how to deal with this dichotomy?

    Well, I’m going to go off and think about it. No, I’m really going to go off for a full week to a cabin I’ve rented in a National Forest in the Midwest. It has no indoor plumbing, phones, Internet connections or anything else – only a bed to sleep in and a place to cook a little. I’m taking one book, some paper, and a guitar with me and that’s it. I plan to spend my days walking, reading a little, playing a little on the guitar, but mostly just thinking. Those of you who know me might find this unusual. I’m an always-on, hyper-caffeinated, overly-busy, connected person. I haven’t taken a vacation in five years, at least for more than two or three days at a time. Even then, I keep us on the move constantly – our vacations aren’t cruises or anything like that. I check e-mail, post and all that. When I’m not on vacation, I live with and leverage lots of technology, and work with those that do the same. This, however, is a really “unplugged” event, and I’m hoping that it will let me unpack the things I’ve been stuffing in my head. I plan to spend a lot of time on a single subject, writing notes, thinking, and writing more notes.

    So after I post tomorrow's “quote of the day” I’ll be “going dark” for a week. No Twitter, Facebook, LinkedIn, e-mail, or chat; none of my five blogs will get updated, and I’ll have to turn in my two articles for InformIT.com early. I won’t have access to my college class portal, so my students will be without me for a week. I will really be offline. I’ll see you in a week – hopefully a little more educated. See you then.

    Read the article

  • Microsoft BI Conference 2011 in Lisbon

    - by AlbertoFerrari
    Anyone interested in BI from Portugal or Spain should not miss the Microsoft BI Conference 2011 in Lisbon: one full day (March 25, 2011) with three tracks on Business Intelligence: Decision Makers, BI Pros, and Intro to BI. I am going to present two sessions on PowerPivot: one is a nice deep dive into DAX for BI pros, the other is about self-service BI for decision makers. Titles and the complete agenda will be published in the coming days, but I suggest saving the date. The full event is free and it...(read more)

    Read the article

  • New Upgrade Technical Reference for SQL Server 2008 R2

    - by Greg Low
    Hi Folks, A year or two back, I was involved in a project with my colleagues from SolidQ (led by Ron Talmage) to construct an Upgrade Technical Reference for SQL Server 2008. It seemed to be well received. We've updated it now to SQL Server 2008 R2 and it's just been published. You'll find it on this web site: http://www.microsoft.com/sqlserver/en/us/product-info/why-upgrade.aspx You'll need to click on the Upgrade Guide link towards the middle of the RHS under the "Why Upgrade" whitepaper. Enjoy!...(read more)

    Read the article

  • Tracking first visit date in Google Analytics

    - by dusan
    I want to track a site's visitor retention using Google Analytics, to see if unregistered users are returning to it within a time frame of 2+ months from now. This blog post seems to be on the right track, but I want to track unregistered users, so I don't have a "join date" or similar variable at my disposal. This other blog post suggests using all 5 GA custom variables, using the first variable slot on the first week, variable 2 on week 2, etc. This method will only allow me to track 5 weeks of visitors. I want to track more than 5 weeks of visitors, so I was thinking of using two custom variables in GA: the visitor's first visit date and the visitor's last visit date. How can I save the first visit date? If I save another value in the same slot the old value will be overwritten, and I don't know how reliable it is to save that variable conditionally (reading the __utmv cookie to check whether the "first visit date" is set, and saving the current date only if it isn't).

    Read the article

  • Speaking About SQL Server

    - by AllenMWhite
    There's a lot of excitement in the SQL Server world right now, with the RTM (Release to Manufacturing) release of SQL Server 2012, and the availability of SQL Server Data Tools (SSDT). My personal speaking schedule has exploded as well. Just this past Saturday I presented a session called Gather SQL Server Performance Data with PowerShell. There are a lot of events coming up, and I hope to see you at one or more of them. Here's a list of what's scheduled so far: First, I'll be presenting a session...(read more)

    Read the article

  • DAX editor for SQL Server

    - by Davide Mauri
    One of the major criticisms of DAX is the lack of a decent editor and, more generally, of a dedicated IDE like the one we have for T-SQL or MDX. Well, this is no longer true. On Codeplex, a very interesting and promising Visual Studio 2010 extension was released at the beginning of November 2011: http://daxeditor.codeplex.com/ Intellisense, syntax highlighting, and all the typical features offered by Visual Studio are also available for DAX. Right now you have to download the source code and compile it, and that’s it!

    Read the article

  • StreamInsight 2.1, meet LINQ

    - by Roman Schindlauer
    Someone recently called LINQ “magic” in my hearing. I leapt to LINQ’s defense immediately. Turns out some people don’t realize “magic” can be a pejorative term. I thought LINQ needed demystification. Here’s your best demystification resource: http://blogs.msdn.com/b/mattwar/archive/2008/11/18/linq-links.aspx. I won’t repeat much of what Matt Warren says in his excellent series, but will talk about some core ideas and how they affect the 2.1 release of StreamInsight. Let’s tell the story of a LINQ query.

    Compile time

    It begins with some code:

        IQueryable<Product> products = ...;
        var query = from p in products
                    where p.Name == "Widget"
                    select p.ProductID;
        foreach (int id in query) {
            ...

    When the code is compiled, the C# compiler (among other things) de-sugars the query expression (see C# spec section 7.16):

        ...
        var query = products.Where(p => p.Name == "Widget").Select(p => p.ProductID);
        ...

    Overload resolution subsequently binds the Queryable.Where<Product> and Queryable.Select<Product, int> extension methods (see C# spec sections 7.5 and 7.6.5). After overload resolution, the compiler knows something interesting about the anonymous functions (lambda syntax) in the de-sugared code: they must be converted to expression trees, i.e., “an object structure that represents the structure of the anonymous function itself” (see C# spec section 6.5). The conversion is equivalent to the following rewrite:

        ...
        var prm1 = Expression.Parameter(typeof(Product), "p");
        var prm2 = Expression.Parameter(typeof(Product), "p");
        var query = Queryable.Select<Product, int>(
            Queryable.Where<Product>(
                products,
                Expression.Lambda<Func<Product, bool>>(Expression.Property(prm1, "Name"), prm1)),
                Expression.Lambda<Func<Product, int>>(Expression.Property(prm2, "ProductID"), prm2));
        ...

    If the “products” expression had type IEnumerable<Product>, the compiler would have chosen the Enumerable.Where and Enumerable.Select extension methods instead, in which case the anonymous functions would have been converted to delegates. At this point, we’ve reduced the LINQ query to familiar code that will compile in C# 2.0. (Note that I’m using C# snippets to illustrate transformations that occur in the compiler, not to suggest a viable compiler design!)

    Runtime

    When the above program is executed, the Queryable.Where method is invoked. It takes two arguments. The first is an IQueryable<> instance that exposes an Expression property and a Provider property. The second is an expression tree. The Queryable.Where method implementation looks something like this:

        public static IQueryable<T> Where<T>(this IQueryable<T> source, Expression<Func<T, bool>> predicate)
        {
            return source.Provider.CreateQuery<T>(
                Expression.Call(this method, source.Expression, Expression.Quote(predicate)));
        }

    Notice that the method is really just composing a new expression tree that calls itself with arguments derived from the source and predicate arguments. Also notice that the query object returned from the method is associated with the same provider as the source query. By invoking operator methods, we’re constructing an expression tree that describes a query. Interestingly, the compiler and operator methods are colluding to construct a query expression tree. The important takeaway is that expression trees are built in one of two ways: (1) by the compiler when it sees an anonymous function that needs to be converted to an expression tree, and (2) by a query operator method that constructs a new queryable object with an expression tree rooted in a call to the operator method (self-referential).

    Next we hit the foreach block. At this point, the power of LINQ queries becomes apparent. The provider is able to determine how the query expression tree is evaluated! The code that began our story was intentionally vague about the definition of the “products” collection. Maybe it is a queryable in-memory collection of products:

        var products = new[]
            { new Product { Name = "Widget", ProductID = 1 } }.AsQueryable();

    The in-memory LINQ provider works by rewriting Queryable method calls to Enumerable method calls in the query expression tree. It then compiles the expression tree and evaluates it. It should be mentioned that the provider does not blindly rewrite all Queryable calls. It only rewrites a call when its arguments have been rewritten in a way that introduces a type mismatch, e.g. the first argument to Queryable.Where<Product> being rewritten as an expression of type IEnumerable<Product> from IQueryable<Product>. The type mismatch is triggered initially by a “leaf” expression like the one associated with the AsQueryable query: when the provider recognizes one of its own leaf expressions, it replaces the expression with the original IEnumerable<> constant expression. I like to think of this rewrite process as “type irritation” because the rewritten leaf expression is like a foreign body that triggers an immune response (further rewrites) in the tree. The technique ensures that only those portions of the expression tree constructed by a particular provider are rewritten by that provider: no type irritation, no rewrite.

    Let’s consider the behavior of an alternative LINQ provider. If “products” is a collection created by a LINQ to SQL provider:

        var products = new NorthwindDataContext().Products;

    the provider rewrites the expression tree as a SQL query that is then evaluated by your favorite RDBMS. The predicate may ultimately be evaluated using an index! In this example, the expression associated with the Products property is the “leaf” expression.

    StreamInsight 2.1

    For the in-memory LINQ to Objects provider, a leaf is an in-memory collection. For LINQ to SQL, a leaf is a table or view. When defining a “process” in StreamInsight 2.1, what is a leaf? To StreamInsight a leaf is logic: an adapter, a sequence, or even a query targeting an entirely different LINQ provider! How do we represent the logic? Remember that a standing query may outlive the client that provisioned it. A reference to a sequence object in the client application is therefore not terribly useful. But if we instead represent the code constructing the sequence as an expression, we can host the sequence in the server:

        using (var server = Server.Connect(...))
        {
            var app = server.Applications["my application"];
            var source = app.DefineObservable(() => Observable.Range(0, 10, Scheduler.NewThread));
            var query = from i in source where i % 2 == 0 select i;
        }

    Example 1: defining a source and composing a query

    Let’s look in more detail at what’s happening in example 1. We first connect to the remote server and retrieve an existing app. Next, we define a simple Reactive sequence using the Observable.Range method. Notice that the call to the Range method is in the body of an anonymous function. This is important because it means the source sequence definition is in the form of an expression, rather than simply an opaque reference to an IObservable<int> object. The variation in Example 2 fails. Although it looks similar, the sequence is now a reference to an in-memory observable collection:

        var local = Observable.Range(0, 10, Scheduler.NewThread);
        var source = app.DefineObservable(() => local); // can’t serialize ‘local’!

    Example 2: error referencing unserializable local object

    The Define* methods support definitions of operator tree leaves that target the StreamInsight server. These methods all have the same basic structure. The definition argument is a lambda expression taking between 0 and 16 arguments and returning a source or sink. The method returns a proxy for the source or sink that can then be used for the usual style of LINQ query composition. The “define” methods exploit the compile-time C# feature that converts anonymous functions into translatable expression trees! Query composition exploits the runtime pattern that allows expression trees to be constructed by operators taking queryable and expression (Expression<>) arguments. The practical upshot: once you’ve Defined a source, you can compose LINQ queries in the familiar way using query expressions and operator combinators. Notably, queries can be composed using pull-sequences (LINQ to Objects IQueryable<> inputs), push sequences (Reactive IQbservable<> inputs), and temporal sequences (StreamInsight IQStreamable<> inputs). You can even construct processes that span these three domains using “bridge” method overloads (ToEnumerable, ToObservable and To*Streamable). Finally, the targeted rewrite via type irritation pattern is used to ensure that StreamInsight computations can leverage other LINQ providers as well. Consider the following example (this example depends on Interactive Extensions):

        var source = app.DefineEnumerable((int id) =>
            EnumerableEx.Using(() =>
                new NorthwindDataContext(), context =>
                    from p in context.Products
                    where p.ProductID == id
                    select p.ProductName));

    Within the definition, StreamInsight has no reason to suspect that it ‘owns’ the Queryable.Where and Queryable.Select calls, and it can therefore defer to LINQ to SQL! Let’s use this source in the context of a StreamInsight process:

        var sink = app.DefineObserver(() => Observer.Create<string>(Console.WriteLine));
        var query = from name in source(1).ToObservable()
                    where name == "Widget"
                    select name;
        using (query.Bind(sink).Run("process"))
        {
            ...
        }

    When we run the binding, the source portion which filters on product ID and projects the product name is evaluated by SQL Server. Outside of the definition, responsibility for evaluation shifts to the StreamInsight server where we create a bridge to the Reactive Framework (using ToObservable) and evaluate an additional predicate. It’s incredibly easy to define computations that span multiple domains using these new features in StreamInsight 2.1!

    Regards,
    The StreamInsight Team

    Read the article

  • 24 Hours of PASS (September 2014): Summit Preview Edition

    - by Sergio Govoni
    Which sessions can you expect to find at the next PASS Summit 2014? Find out on September 9, 2014 (12:00 GMT) at the free online event 24 Hours of PASS: Summit Preview Edition. Register now at this link. No matter what part of the world you follow the event from, the important thing to know is that it will be 24 hours of continuous training on SQL Server and Business Intelligence, right on your computer!

    Read the article

  • SQL 2014 does data the way developers want

    - by Rob Farley
    A post I’ve been meaning to write for a while, good that it fits with this month’s T-SQL Tuesday, hosted by Joey D’Antoni (@jdanton). Ever since I got into databases, I’ve been a fan. I studied Pure Maths at university (as well as Computer Science), and am very comfortable with Set Theory, which undergirds relational database concepts. But I’ve also spent a long time as a developer, and appreciate that databases don’t exactly fit within the stuff I learned in my first year of uni, particularly the “Algorithms and Data Structures” subject, in which we studied concepts like linked lists. Writing in languages like C, we used pointers to quickly move around data, without a database in sight. Of course, if we had a power failure all this data was lost, as it was only persisted in RAM. Perhaps it’s why I’m a fan of database internals, of indexes, latches, execution plans, and so on – the developer in me wants to be reassured that we’re getting to the data as efficiently as possible. Back when SQL Server 2005 was approaching, one of the big stories was around CLR. Many were saying that T-SQL stored procedures would be a thing of the past because we now had CLR, which was obviously going to be much faster than using the abstracted T-SQL. Around the same time, we were seeing technologies like Linq-to-SQL produce poor T-SQL equivalents, and developers had had a gutful. They wanted to move away from T-SQL, having lost trust in it. I was never one of those developers, because I’d looked under the covers and knew that despite being abstracted, T-SQL was still a good way of getting to data. It worked for me, appealing to both my Set Theory side and my Developer side. CLR hasn’t exactly become the default option for stored procedures, although there are plenty of situations where it can be useful for getting faster performance. SQL Server 2014 is different though, through Hekaton – its In-Memory OLTP environment. When you create a table using Hekaton (that is, a memory-optimized one), the table you create is the kind of thing you’d’ve made as a developer. It creates code in C leveraging structs and pointers and arrays, which it compiles into fast code. When you insert data into it, it creates a new instance of a struct in memory, and adds it to an array. When the insert is committed, a small write is made to the transaction log to make sure it’s durable, but none of the locking and latching behaviour that typifies transactional systems is needed. Indexes are done using hashes and Bw-trees (which avoid locking through the use of pointers), and each update is handled as a delete-and-insert. This is data the way that developers do it when they’re coding for performance – the way I was taught at university before I learned about databases. Being done in C, it compiles to very quick code, and although these tables don’t support every feature that regular SQL tables do, this is still an excellent direction that has been taken. @rob_farley
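
    For readers who haven’t seen the syntax, here is a minimal sketch (not from Rob’s post – the table and column names are made up) of what a memory-optimized table declaration looks like in SQL Server 2014; it assumes the database already has a MEMORY_OPTIMIZED_DATA filegroup:

        -- Hypothetical example: a durable memory-optimized (Hekaton) table with a hash
        -- index on the primary key and a nonclustered index for lookups on CustomerID.
        CREATE TABLE dbo.CustomerSession
        (
            SessionID  INT       NOT NULL
                PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
            CustomerID INT       NOT NULL,
            StartedAt  DATETIME2 NOT NULL,
            INDEX ix_CustomerID NONCLUSTERED (CustomerID)
        )
        WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);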

    Read the article

  • Placeholder images for testing reports

    - by Greg Low
    Lorem Ipsum has long been used to provide placeholder text for testing report and document layouts. Programs such as Microsoft Word have also included options for generating sample text. (For example, type =rand() anywhere in a blank area of a Microsoft Word document and hit Enter.) Matthew Roche and Donald Farmer both sent me a link the other day to an online service that provides placeholder images. This could be quite useful when testing report layouts in SQL Server Reporting Services. You'll find it here: http://lorempixel.com/ Nice! As an example, here's a random sports image. Of course I have no idea what you'll see on this page :-)

    Read the article

  • Social media and special characters

    - by John Paul Cook
    I’ve previously blogged about using Unicode with T-SQL to put superscripts, subscripts, and special characters into text strings. Unicode is also useful in formatting social media such as Facebook, Twitter, and that dinosaur otherwise known as email. When you can’t set properties of text such as italicizing the subject line of an email message or adding subscripts to a Facebook post, Unicode can make it possible. There are Unicode characters that are intrinsically italicized. Others are intrinsically...(read more)
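
    As a quick illustration (my example, not taken from the original post), the T-SQL NCHAR() function turns a Unicode code point into the character itself, so strings with superscripts or subscripts can be built directly and pasted into an email subject or a social media post:

        -- U+00B2 is superscript two, U+2082 is subscript two.
        SELECT N'x' + NCHAR(0x00B2) + N' + y' + NCHAR(0x00B2) AS Superscripts,
               N'H' + NCHAR(0x2082) + N'O'                    AS Subscript;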

    Read the article

  • How In-Memory Database Objects Affect Database Design: Hybrid Code

    - by drsql
    In my first attempts at building my code, I strictly went with either native or on-disk code. I specifically wrote the on-disk code to only use features that worked in-memory. This led to one majorly silly bit of code, used to create system-assigned key values. How would I create a customer number that was unique? We can’t use the Max(value) + 1 approach because it will be very hideous with MVCC isolation levels, since 100 connections might see the same value, leading to lots of duplication. You...(read more)
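
    One common on-disk alternative (my sketch, not necessarily the approach the post goes on to describe, and the object names are invented) is to let a SEQUENCE hand out the key values, so concurrent sessions never see the same number:

        -- Each NEXT VALUE FOR call returns a distinct number, even under snapshot/MVCC
        -- isolation, avoiding the Max(value) + 1 duplication problem.
        CREATE SEQUENCE dbo.CustomerNumber AS INT START WITH 1000 INCREMENT BY 1;

        INSERT INTO dbo.Customer (CustomerNumber, CustomerName)
        VALUES (NEXT VALUE FOR dbo.CustomerNumber, N'Contoso');

    (In SQL Server 2014 sequences can’t be called from natively compiled modules, so this belongs on the interop/on-disk side – which is exactly the hybrid-code territory the post is about.)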

    Read the article

  • SQLU Professional Development Week: The Difference Between Your Business and Community Presence

    - by andyleonard
    Introduction Proto-earth, dinosaurs, and the days when your personal and business profiles were separate and distinct. What do these things all have in common? They are all in the past. Background Checks Corporate background checks now routinely include a search for social media profiles, forum posts, and blogs. As professionals are learning, the things you say and do in your "off-work hours" can and will be used against you - even after you're hired and have been doing the job awhile. One point?...(read more)

    Read the article

  • More than one way to skin an Audit

    - by BuckWoody
    I get asked quite a bit about auditing in SQL Server. By "audit", people mean everything from tracking logins to finding out exactly who ran a particular SELECT statement. In the really early versions of SQL Server, we didn't have a great story for very granular audits, so lots of workarounds were suggested. As time progressed, more and more audit capabilities were added to the product, and in typical database platform fashion, as we added a feature we didn't often take the others away. So now, instead of not having an option to audit actions by users, you might face the opposite problem - too many ways to audit! You can read more about the options you have for tracking users here: http://msdn.microsoft.com/en-us/library/cc280526(v=SQL.100).aspx  In SQL Server 2008, we introduced SQL Server Audit, which uses Extended Events to really get a simple way to implement high-level or granular auditing. You can read more about that here: http://msdn.microsoft.com/en-us/library/dd392015.aspx  As with any feature, you should understand what your needs are first. Auditing isn't "free" in the performance sense, so you need to make sure you're only auditing what you need to.
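
    To give a feel for how little setup SQL Server Audit needs, here is a minimal sketch (the names and file path are illustrative, not from Buck’s post) that captures failed logins to a file target:

        -- 1. Define where audit records go.
        CREATE SERVER AUDIT LoginAudit
            TO FILE (FILEPATH = N'C:\AuditLogs\');
        GO
        -- 2. Say which actions to capture.
        CREATE SERVER AUDIT SPECIFICATION LoginAuditSpec
            FOR SERVER AUDIT LoginAudit
            ADD (FAILED_LOGIN_GROUP)
            WITH (STATE = ON);
        GO
        -- 3. Switch the audit on.
        ALTER SERVER AUDIT LoginAudit WITH (STATE = ON);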

    Read the article

  • Did You Know? I'm doing 3 more online seminars with SSWUG!

    - by Kalen Delaney
    As I told you in April, I recorded two more seminars with Stephen Wynkoop, on aspects of Query Processing. The first one will be broadcast on June 30 and the second on August 27. In between, we'll broadcast my Index Internals seminar, on July 23. Workshops can be replayed for up to a week after the broadcast, and you can even buy a DVD of the workshop. You can get more details by clicking on the workshop name, below, or check out the announcement on the SSWUG site at http://www.sswug.org/editorials/default.aspx?id=1948...(read more)

    Read the article

  • Messages do not always appear in [catalog].[event_messages] in the order that they occur [SSIS]

    - by jamiet
    This is a simple heads up for anyone doing SQL Server Integration Services (SSIS) development using SSIS 2012. Be aware that messages do not always appear in [catalog].[event_messages] in the order that they occur, observe… In the following query I am looking at a subset of messages in [catalog].[event_messages] and ordering them by [event_message_id]:

        SELECT [event_message_id], [event_name], [message_time], [message_source_name]
        FROM   [catalog].[event_messages] em
        WHERE  [event_message_id] BETWEEN 290972 AND 290982
        ORDER  BY [event_message_id] ASC
        --ORDER BY [message_time] ASC

    Take a look at the two rows that I have highlighted: note how the OnPostExecute event for “Utility GetTargetLoadDatesPerETLIfcName” appears after the OnPreExecute event for “FELC Loop over TargetLoadDates”. I happen to know that this is incorrect because “Utility GetTargetLoadDatesPerETLIfcName” is a package that gets executed by an Execute Package Task prior to the For Each Loop “FELC Loop over TargetLoadDates”. If we order instead by [message_time] then we see something that makes more sense:

        SELECT [event_message_id], [event_name], [message_time], [message_source_name]
        FROM   [catalog].[event_messages] em
        WHERE  [event_message_id] BETWEEN 290972 AND 290982
        --ORDER BY [event_message_id] ASC
        ORDER  BY [message_time] ASC

    We can see that the OnPostExecute for “Utility GetTargetLoadDatesPerETLIfcName” did indeed occur before the OnPreExecute event for “FELC Loop over TargetLoadDates”; they just did not get assigned an [event_message_id] in chronological order. We can speculate as to why that might be (I suspect the explanation is something to do with the two executables appearing in different packages) but the reason is not the important thing here, just be aware that you should be ordering by [message_time] rather than [event_message_id] if you want to get 100% accurate insights into your executions. @Jamiet

    Read the article

  • Summit reflections

    - by Rob Farley
    So far, my three PASS Summit experiences have been notably different to each other. My first, I wasn’t on the board and I gave two regular sessions and a Lightning Talk in which I told jokes. My second, I was a board advisor, and I delivered a precon, a spotlight and a Lightning Talk in which I sang. My third (last week), I was a full board director, and I didn’t present at all. Let’s not talk about next year. I’m not sure there are many options left. This year, I noticed that a lot more people recognised me and said hello. I guess that’s potentially because of the singing last year, but could also be because board elections can bring a fair bit of attention, and because of the effort I’ve put in through things like 24HOP... Yeah, ok. It’d be the singing. My approach was very different though. I was watching things through different eyes. I looked for the things that seemed to be working and the things that didn’t. I had staff there again, and was curious to know how their things were working out. I knew a lot more about what was going on behind the scenes to make various things happen, and although very little about the Summit was actually my responsibility (based on not having that portfolio), my perspective had moved considerably. Before the Summit started, Board Members had been given notebooks – an idea Tom (who heads up PASS’ marketing) had come up with after being inspired by seeing Bill walk around with a notebook. The plan was to take notes about feedback we got from people. It was a good thing, and the notebook forms a nice pair with the SQLBits one I got a couple of years ago when I last spoke there. I think one of the biggest impacts of this was that during the first keynote, Bill told everyone present about the notebooks. This set a tone of “we’re listening”, and a number of people were definitely keen to tell us things that would cause us to pull out our notebooks. PASSTV was a new thing this year. Justin, the host, featured on the couch and talked to a lot of people about a lot of things, including me (he talked to me about a lot of things; I don’t think he talked to a lot of people about me). Reaching people through online methods is something which interests me a lot – it has huge potential, and I love the idea of being able to broadcast to people who are unable to attend in person. I’m keen to see how this medium can be developed over time. People who know me will know that I’m a keen advocate of certification – I've been SQL certified since version 6.5, and have even been involved in creating exams. However, I don’t believe in studying for exams. I think training is worthwhile for learning new skills, but the goal should be on learning those skills, not on passing an exam. Exams should be for proving that the skills are there, not a goal in themselves. The PASS Summit is an excellent place to take exams though, and with an attitude of professional development throughout the event, why not? So I did. I wasn’t expecting to take one, but I was persuaded and took the MCM Knowledge Exam. I hadn’t even looked at the syllabus, but tried it anyway. I was very tired, and even fell asleep at one point during it. I’ll find out my result at some point in the future – the Prometric site just says “Tested” at the moment. As I said, it wasn’t something I was expecting to do, but it was good to have something unexpected during the week. Of course it was good to catch up with old friends and make new ones.
I feel like every time I’m in the US I see things develop a bit more, with more and more people knowing who I am, who my staff are, and recognising the LobsterPot brand. I missed being a presenter, but I definitely enjoyed seeing many friends on the list of presenters. I won’t try to list them, because there are so many these days that people might feel sad if I don’t mention them. For those that I managed to see, I was pleased to see that the majority of them have lifted their presentation skills since I last saw them, and I happily told them as much. One person who I will mention was Paul White, who travelled from New Zealand to his first PASS Summit. He gave two sessions (a regular session and a half-day), packed large rooms of people, and had everyone buzzing with enthusiasm. I spoke to him after the event, and he told me that his expectations were blown away. Paul isn’t normally a fan of crowds, and the thought of 4000 people would have been scary. But he told me he had no idea that people would welcome him so well, be so friendly and so down to earth. He’s seen the significance of the SQL Server community, and says he’ll be back. It’ll be good to see him there. Will you be there too?

    Read the article

  • After changing web host, I get a 'file does not exist' error

    - by Jordan
    I run a WordPress blog, and have recently changed web hosts. When changing web hosts, I copied all the files and exported/imported the database etc. as explained by lots of tutorials found easily on Google. The blog home page works fine. What goes wrong: When I click on any link from the home page, the browser gets stuck in a redirect loop. Looking at the error log, I see: File does not exist: /usr/local/apache/htdocs/index.php The directory /usr doesn't even exist for my website - so perhaps this is looking for a file that was present with my old web host and is no longer present with my new one? What is going on, and how might I resolve it?

    Read the article

  • Working with Reporting Services Filters–Part 5: OR Logic

    - by smisner
    When you combine multiple filters, Reporting Services uses AND logic. Once upon a time, there was actually a drop-down list for selecting AND or OR between filters, which was very confusing to people because often it was grayed out. Now that selection is gone, but no matter. It wouldn’t help us solve the problem that I want to describe today. As with many problems, Reporting Services gives us more than one way to apply OR logic in a filter. If I want a filter to include this value OR that value for the same field, one approach is to set up the filter to use the IN operator as I explained in Part 1 of this series. But what if I want to base the filter on two different fields? I need a different solution. Using the AdventureWorksDW2008R2 database, I have a report that lists product sales. Let’s say that I want to filter this report to show only products that are Bikes (a category) OR products for which sales were greater than $1,000 in a year. If I set up the filter like this:

        Expression      Data Type   Operator   Value
        [Category]      Text        =          Bikes
        [SalesAmount]               >          1000

    Then AND logic is used, which means that both conditions must be true. That’s not the result I want. Instead, I need to set up the filter like this:

        Expression:  =Fields!EnglishProductCategoryName.Value = "Bikes" OR Fields!SalesAmount.Value > 1000
        Data Type:   Boolean
        Operator:    =
        Value:       =True

    The OR logic needs to be part of the expression so that it can return a Boolean value that we test against the Value. Notice that I have used =True rather than True for the value. The filtered report appears below. Any non-bike product appears only if the total sales exceed $1,000, whereas Bikes appear regardless of sales. (You can’t see it in this screenshot, but Mountain-400-W Silver, 38 has sales of $923 in 2007 but gets included because it is in the Bikes category.)

    Read the article

  • Free tools for SQL Server - Automating Execution Plan Analysis

    - by jchang
    Since this topic is being discussed, I will plug my own tools: SQL Exec Stats and its (a little dated) documentation. The main capability is cross-referencing index usage with specific execution plans. Another feature is generating execution plans for all stored procedures in a database, along with the index usage cross-reference. There are several sources of execution plans or plan handles; this could be a live trace, a previously saved trace, previously saved sqlplan files, from dm_exec_cached_plans,...(read more)
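
    For context, this is the kind of DMV query such tools build on – a rough sketch of mine (not taken from SQL Exec Stats) that pulls cached plans together with their SQL text so they can be cross-referenced:

        -- The most reused plans in the cache, with their batch text and XML showplan.
        SELECT TOP (50)
               cp.usecounts,
               st.text       AS batch_text,
               qp.query_plan
        FROM   sys.dm_exec_cached_plans AS cp
        CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
        CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
        ORDER  BY cp.usecounts DESC;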

    Read the article

  • Cloud Computing Pricing - It's like a Hotel

    - by BuckWoody
    I normally don't go into the economics or pricing side of Distributed Computing, but I've had a few friends that have been surprised by a bill lately and I wanted to quickly address at least one aspect of it. Most folks are used to buying software and owning it outright - like buying a car. We pay a lot for the car, and then we use it whenever we want. We think of the "cloud" services as a taxi - we'll just pay for the ride we take and no more. But it's not quite like that. It's actually more like a hotel. When you subscribe to Azure using a free offering like the MSDN subscription, you don't have to pay anything for the service. But when you create an instance of a Web or Compute Role, Storage, that sort of thing, you can think of the idea of checking into a hotel room. You get the key, you pay for the room. For Azure, using bandwidth, CPU and so on is billed just like it states in the Azure Portal. So in effect there is a cost for the service and then a cost to use it, like water or power or any other utility. Where this bit some folks is that they created an instance, played around with it, and then left it running. No one was using it, no one was on - so they thought they wouldn't be charged. But they were. It wasn't much, but it was a surprise. They had the hotel room key, but they weren't in the room, so to speak. To add to their frustration, they had to talk to someone on the phone to cancel the account. I understand the frustration. Although we have all this spelled out in the sign up area, not everyone has the time to read through all that. I get that. So why not make this easier? As an explanation, we bill for that time because the instance is still running, and we have to tie up resources to be available the second you want them, and that costs money. As far as being able to cancel from the portal, that's also something that needs to be clearer. You may not be aware that you can spin up instances using code - and so cancelling from the Portal would allow you to do the same thing. Since a mistake in code could erase all of your instances and the account, we make you call to make sure you're you and you really want to take it down. Not a perfect system by any means, but we'll evolve this as time goes on. For now, I wanted to make sure you're aware of what you should do. By the way, you don't have to cancel your whole account not to be billed. Just delete the instance from the portal and you won't be charged. You don't have to call anyone for that. And just FYI - you can download the SDK for Azure and never even hit the online version at all for learning and playing around. No sign-up, no credit card, PO, nothing like that. In fact, that's how I demo Azure all the time. Everything runs right on your laptop in an emulated environment.

    Read the article

  • Build in better usability with UX Direct

    - by JuergenKress
    The Oracle Applications User Experience team has created a program called Oracle UX Direct to provide customers, partners, and consultants in the enterprise industry with design best practices and tools that they can leverage to make their enterprise implementations more successful. Read the Voice of User Experience, or VoX, blog to learn more about why the program was created, and visit the UX Direct web site to find out how to introduce design thinking during the implementation stage. Create a solution that best fits the needs of users from the beginning. Read more about UX Direct on VoX. For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • Free Webinar - Using Enterprise Data Integration Dashboards

    - by andyleonard
    Join Kent Bradshaw and me as we present Using Enterprise Data Integration Dashboards Tuesday 11 Dec 2012 at 10:00 AM ET! If data is the life of the modern organization, data integration is the heart of an enterprise. Data circulation is vital. Data integration dashboards provide enterprise ETL (Extract, Transform, and Load) teams near-real-time status supported with historical performance analysis. Join Linchpins Kent Bradshaw and Andy Leonard as they demonstrate and discuss the benefits of data...(read more)

    Read the article
