Search Results

Search found 10442 results on 418 pages for 'blog'.

Page 92/418 | < Previous Page | 88 89 90 91 92 93 94 95 96 97 98 99  | Next Page >

  • Making meetings much more efficient

    - by John Paul Cook
    Water. Yes, that’s simple. There needs to be a rule that whoever convenes a meeting be required to drink water. Half a liter of water when the meeting begins and another half a liter every half hour thereafter. That simple rule will motivate the meeting organizer to find closure quickly – and the exit. What day did I post this? ...(read more)

    Read the article

  • SQL Down Under podcast 60 with SQL Server MVP Adam Machanic

    - by Greg Low
    I managed to get another podcast posted over the weekend. Late last week, I recorded a show with Adam Machanic. Adam's always fascinating. In this show, he's talking about what he's found regarding increasing query performance using parallelism. Late in the show, he gives his thoughts on a number of topics related to the upcoming SQL Server 2014. Enjoy! The show is online now: http://www.sqldownunder.com/Podcasts

    Read the article

  • Working with Reporting Services Filters–Part 1

    - by smisner
    There are two ways that you can filter data in Reporting Services. The first way, which usually provides faster performance, is to use query parameters to apply a filter in the WHERE clause of a SQL statement. In that case, the structure of the filter depends upon the syntax recognized by the source database. The other way to filter data in Reporting Services is to apply a filter to a dataset, data region, or a group. Using this latter method, you can even apply multiple filters. However, the use of filter operators and the setup of multiple filters are not always obvious, so in this series of posts I'll provide some more information about the configuration of filters.
    First, why not use query parameters exclusively for filtering? Here are a few reasons: You might want to apply a filter to part of the report, but not all of the report. Your dataset might retrieve data from a stored procedure that doesn't allow you to pass a query parameter for filtering purposes. Your report might be set up as a snapshot on the report server and, in that case, cannot be dynamically filtered based on a query parameter.
    Next, let's look at how to set up a report filter in general. The process is the same whether you are applying the filter to a dataset, data region, or group. When you go to the Filters page in the Properties dialog box for whichever of these items you selected (dataset, data region, group), you click the Add button to create a new filter. The Expression field is usually a field in the dataset, so to make it easier for you to make a selection, the drop-down list displays all of the current dataset fields. But notice the expression button to the right, which means that you can set up any type of expression, not just a dataset field. To the right of the expression button, you'll find a data type drop-down list. It's important to specify the correct data type for the field or expression you're using.
    Now for the operators. Here's a list of the options that you have:
    =, <>, >, >=, <, <=, Like: compares the expression to a value
    Top N, Bottom N: compares the expression to the top (bottom) set of N values (N = integer)
    Top %, Bottom %: compares the expression to the top (bottom) N percent of values (N = integer or float)
    Between: determines whether the expression is between two values, inclusive
    In: determines whether the expression is found in a list of values
    Last, the Value is what you're comparing to the expression using the operator. The construction of a filter using some operators (=, <>, >, etc.) is fairly simple. If my dataset (for AdventureWorks data) has a Category field, and I have a parameter that prompts the user for a single category, I can set up a filter like this: Expression = [Category], Data Type = Text, Operator = =, Value = [@Category]. But if I set the parameter to accept multiple values, I need to change the operator from = to In, just as I would have to do if I were using a query parameter. The parameter expression, [@Category], which translates to =Parameters!Category.Value, doesn’t need to change because it represents an array as soon as I change the parameter to allow multiple values. The In operator requires an array.
    With that in mind, let’s consider a variation on Value. Let’s say that I have a parameter that prompts the user for a particular year (for simplicity’s sake, this parameter only allows a single value), and I have an expression that evaluates the previous year based on the user’s selection.
    Then I want to use these two values in two separate filters with an OR condition. That is, I want to filter either by the year selected OR by the year that was computed. If I create two filters, one for each year (as shown below), then the report will only display results if BOTH filter conditions are met – which would never be true:
    Filter 1: Expression = [CalendarYear], Data Type = Integer, Operator = =, Value = [@Year]
    Filter 2: Expression = [CalendarYear], Data Type = Integer, Operator = =, Value = =Parameters!Year.Value-1
    To handle this scenario, we need to create a single filter that uses the In operator, and then set up the Value expression as an array. To create an array, we use the Split function after creating a string that concatenates the two values, as shown below:
    Expression = =CStr(Fields!CalendarYear.Value), Data Type = Text, Operator = In, Value = =Split(CStr(Parameters!Year.Value) + "," + CStr(Parameters!Year.Value-1), ",")
    Note that in this case, I had to apply a string conversion on the year integer so that I could concatenate the parameter selection with the calculated year. Pay attention to the second argument of the Split function: you must use a comma delimiter for the result to work correctly with the In operator. I also had to change the Expression value from [CalendarYear] (or =Fields!CalendarYear.Value) to =CStr(Fields!CalendarYear.Value) so that the expression would return a string that I could compare with the values in the string array. More fun with filter expressions in future posts!
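    To see what the Split expression is doing, here is the same pattern expressed in C# purely for illustration (this is not SSRS code, and the variable names are made up): a comma-joined string is split back into an array, and the In-style test then checks membership in that array.

        using System;

        class SplitFilterDemo
        {
            static void Main()
            {
                // Illustrative C# analogue of the SSRS Split/In pattern above.
                string selectedYear = "2008";                         // what the user picked
                string previousYear = (int.Parse(selectedYear) - 1).ToString();

                // Concatenate with a comma, then split back into a string array:
                // the array is what an "In" style membership test needs.
                string[] years = (selectedYear + "," + previousYear).Split(',');

                string calendarYear = "2007";                         // a row's CalendarYear, as text
                bool rowPassesFilter = Array.IndexOf(years, calendarYear) >= 0;
                Console.WriteLine(rowPassesFilter);                   // True
            }
        }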

    Read the article

  • Silverlight Cream for May 22, 2010 -- #867

    - by Dave Campbell
    In this Issue: Michael Washington, Xianzhong Zhu, Jim Lynn, Laurent Bugnion, and Kyle McClellan. A ton of Shoutouts this time: Cigdem Patlak (CrocusGirl) is interviewed about Silverlight 4 on Channel 9: Silverlight discussion with Cigdem Patlak. Timmy Kokke has material up from a presentation he did, and check out the SilverAmp project he's got going: Code & Slides – SDE – What’s new in Silverlight 4. Graham Odds at ScottLogic has an interesting post up: Contextual cues in user interface design. Einar Ingebrigtsen is discussing Balder licensing and is asking for input: Balder - Licensing. SilverLaw has updated two of his stylings at the Expression Gallery to Silverlight 4: ChildWindow and Accordion Styling Silverlight 4. Keep this page bookmarked -- it's the only page you'll need for Silverlight and Expression links.. well, that and my blog :) .. from Adam Kinney: Silverlight and Expression Blend. Jeremy Boyd and John-Daniel Trask have some sweet-looking controls in their new release: Introducing Silverlight Elements 1.1. Matthias Shapiro entered the Design for America competition with his Recovery Review: A Silverlight Sunlight Foundation Visualization Project -- be sure to check out his blog post about it, there's a link at the bottom. Koen Zwikstra announced a new release: Document Toolkit 2 Beta 1 available ... built for SL4 and lots of features -- check out the blog post. From SilverlightCream.com: Simple Example To Secure WCF Data Service OData Methods – Michael Washington has a follow-on tutorial up on WCF Data Security with OData -- essentially this is the 'securing the data' part ... the Silverlight part was in the previous post... all code is available. Developing Freecell Game Using Silverlight 3 Part 1 – Xianzhong Zhu has the first of a two-part tutorial up on building Freecell in Silverlight 3 ... yeah... SL3 -- oh, can you say WP7?? :) Silverlight Top Tip: Startup page for Navigation Apps – Jim Lynn has detailed how to go straight to a specific page you're working on in a complex Silverlight app, say for debug purposes, rather than page/page/page ... I was just thinking yesterday about putting a shortcut on my taskbar for something similar in .NET :) Handling DataGrid.SelectedItems in an MVVM-friendly manner – Laurent Bugnion responded with code to a question about getting a DataGrid's SelectedItems into the ViewModel in MVVM Light. Demo code available too. RIA Services and Windows Live ID – Kyle McClellan has a post up discussing using Live ID with RIA Services and Silverlight. Lots of external links sprinkled around. Stay in the 'Light!

    Read the article

  • List SQL Server Instances using the Registry

    - by BuckWoody
    I read this interesting article on using PowerShell and the registry, and thought I would modify his information a bit to list the SQL Server instances on a box. The interesting thing about listing instances this way is that you can touch remote machines, find the instances when they are off, and so on. Anyway, here's the scriptlet I used to find the instances on my system:
        $MachineName = '.'
        $reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $MachineName)
        $regKey = $reg.OpenSubKey("SOFTWARE\\Microsoft\\Microsoft SQL Server\\Instance Names\\SQL")
        $regKey.GetValueNames()
    You can read more of his article to find out the reason for the remote registry call and so forth – there are also security implications here for being able to read the registry. Script Disclaimer, for people who need to be told this sort of thing: Never trust any script, including those that you find here, until you understand exactly what it does and how it will act on your systems. Always check the script on a test system or Virtual Machine, not a production system. Yes, there are always multiple ways to do things, and this script may not work in every situation, for everything. It’s just a script, people. All scripts on this site are performed by a professional stunt driver on a closed course. Your mileage may vary. Void where prohibited. Offer good for a limited time only. Keep out of reach of small children. Do not operate heavy machinery while using this script. If you experience blurry vision, indigestion or diarrhea during the operation of this script, see a physician immediately.
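    For comparison, here is a rough C# sketch of the same lookup using the standard Microsoft.Win32 registry API (my own illustration, not part of the original post; remote access through OpenRemoteBaseKey assumes the Remote Registry service is running and you have permission):

        using System;
        using Microsoft.Win32;

        class ListSqlInstances
        {
            static void Main()
            {
                // "." targets the local machine; a remote machine name also works
                // if remote registry access is enabled and you have permission.
                string machineName = ".";

                using (RegistryKey hklm = RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, machineName))
                using (RegistryKey key = hklm.OpenSubKey(@"SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\SQL"))
                {
                    if (key == null)
                    {
                        Console.WriteLine("No SQL Server instances found.");
                        return;
                    }

                    // Each value name under this key is an installed instance name.
                    foreach (string instance in key.GetValueNames())
                    {
                        Console.WriteLine(instance);
                    }
                }
            }
        }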

    Read the article

  • Converting #MDX to #DAX and PowerPivot Workshop online #ppws

    - by Marco Russo (SQLBI)
    I just published the article Converting MDX to DAX – First Steps on the renewed SQLBI web site. The reason is that with BISM Tabular in Analysis Services 2012 you will be able to write queries in both DAX and MDX. If you already know MDX, you might wonder how to “translate” your MDX knowledge into DAX. I think this is another way you can improve your knowledge of DAX: it is built on different concepts, and this comparison should be helpful for that purpose. This is...(read more)

    Read the article

  • First Post

    - by Allan Ritchie
    It has been a while since I've had a blog, but I'm back into open source dev and decided to get back into things. I had a blog a few years back when NHibernate was in its infancy (0.8 or something) and I was working with the Wilson ORMapper (www.ormapper.net) at the time. Anyhow, I'm still working with NHibernate (particularly the exciting v3 alpha 1) and the Castle framework. I've also written a .NET ExtDirect stack, which I'll be writing a few articles about because of its flexibility. I decided to write yet another communication stack because all the implementations I found on the Ext forums were lacking any sort of flexibility. So stay tuned... I'll be presenting a bunch of the extension points.

    Read the article

  • XMLPad – a new tool in my developer utility belt

    - by jamiet
    Yesterday I was on the lookout for a free tool that would help me write XPath statements. I put a shout out on Twitter and Johan Barnard replied saying: Give XMLPad a try http://www.wmhelp.com/xmlpad3.htm. I’m sure there are legions of developers out there that know all about XMLPad but I had never heard of it, so I suspect some of you reading this haven’t either. Today I downloaded it to give it a run out and I gotta say – I love it. I only used it to do one thing – constructing an XPath expression to point to a particular Configuration definition in a .dtsx file – and it allowed me to do that with consummate ease. The feature I particularly loved was that, similar to Google Suggest, it showed me results from my expression as I typed. Here is my XPath expression to find (and just try saying this in a hurry) the value of a property whose DTS:Name attribute equals 'ConfigurationString', of a Configuration definition where the value of that Configuration definition's property whose DTS:Name attribute equals 'ObjectName' equals 'BIConfig':
        /DTS:Executable/DTS:Configuration[DTS:Property[@DTS:Name='ObjectName']='BIConfig']/DTS:Property[@DTS:Name='ConfigurationString']
    And believe me, there was no way I would have been able to come up with that without a tool to help me! So, an easy tip for you – if you need to write XPath expressions, download XMLPad for free from http://www.wmhelp.com/xmlpad3.htm and see what it can do for you. That’s all. It's now Friday evening and I’m shutting down and relaxing before heading to the big game at Twickenham tomorrow (yes, I have a ticket). Have a good one! @Jamiet
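    If you then want to evaluate an expression like that from code rather than in XMLPad, here is a hedged C# sketch (my own illustration: the package path is hypothetical, and the DTS namespace URI shown is the one .dtsx packages normally declare – verify it against your package's root element):

        using System;
        using System.Xml;

        class FindConfigurationString
        {
            static void Main()
            {
                var doc = new XmlDocument();
                doc.Load(@"C:\packages\MyPackage.dtsx");   // hypothetical package path

                // The DTS prefix must be mapped to the namespace declared in the package.
                var ns = new XmlNamespaceManager(doc.NameTable);
                ns.AddNamespace("DTS", "www.microsoft.com/SqlServer/Dts");

                XmlNode node = doc.SelectSingleNode(
                    "/DTS:Executable/DTS:Configuration[DTS:Property[@DTS:Name='ObjectName']='BIConfig']" +
                    "/DTS:Property[@DTS:Name='ConfigurationString']",
                    ns);

                Console.WriteLine(node != null ? node.InnerText : "Not found");
            }
        }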

    Read the article

  • SSIS Reporting Pack update

    - by jamiet
    It's been a while since I last posted anything in regard to SSIS Reporting Pack, the most recent release being on 27th May 2012, so here is a short update. There is still lots of work to do on SSIS Reporting Pack; lots more features to add, lots of performance work to be done, and a few bug fixes too. I have also been (fairly) hard at work on a framework to be used in conjunction with SSIS 2012 that I refer to as the Restart Framework (currently residing at http://ssisrestartframework.codeplex.com/). There is still much work to be done on the Restart Framework (not least some useful documentation on how to use it), which is why I haven’t mentioned it publicly before now, although I am actively checking in changes. One thing I am considering is amalgamating the two projects into one; this would mean I could build a suite of reports that work both against the SSIS Catalog (what you currently know as “SSIS Reporting Pack”) and also against this Restart Framework thing. No decision has been made as yet though. There have been a number of bug reports and feature suggestions for SSIS Reporting Pack added to the Issue Tracker. Thank you to everyone that has submitted something; rest assured I am not going to ignore them forever; my time is at a premium right now unfortunately due to … well … life… so working on these items isn’t near the top of my priority list. Lastly, I am actively using SSIS Reporting Pack in a production environment right now and I’m happy to report that it is proving to be very useful. One of the reports that I have put a lot of time into is execution executable duration.rdl, and it's proving very adept at easily identifying bottlenecks in our SSIS 2012 executions. The report allows you to browse through the hierarchy of executables in each execution, and each bar represents the duration of each executable in relation to all the other executables; longer bars being a good indication of where problems might lie. The colour of the bar indicates whether it was successful or not (green=success). Hovering over a bar brings up a tooltip showing more information about that executable. Clicking on a bar allows you to compare this particular instance of the executable against other executions. Please do let me know if you are using SSIS Reporting Pack. I would like to hear any anecdotes you might have, good or bad. @Jamiet

    Read the article

  • Blogspot as a simple CMS

    - by G1ug
    Blogger/Blogspot recently released a new version of their software. This new version appears to have features relevant to a simple CMS (static pages, albeit limited). I read on their Buzz Blog about a few websites that don't necessarily look like a typical Blogspot blog, but rather like a typical website deployed using a minimal CMS: http://buzz.blogger.com/2011/07/you-can-do-some-amazing-things-with.html Can anyone point me to resources where I can learn how to do this? (Preferably case studies with some steps on how to create such a website, as opposed to a Blogger HOWTO.) Bonus points if you can also tell me about the infrastructure of Blogger.com (software stack, etc.). Thanks

    Read the article

  • Watch @marcorus and @ferrarialberto sessions online #teched #msteched #tee2012

    - by Marco Russo (SQLBI)
    In June I participated in two TechEd editions (North America and Europe). Alberto and I delivered a pre-conference seminar and two sessions about Tabular. Both conferences provide recorded sessions freely available on Channel 9, so you can compare which one was delivered better! If you have to choose between the two versions, consider that in North America we received more questions during and after the session (while still recording), increasing the interaction, whereas in Europe questions usually came after the session finished (so there is no recording of them). If you’re curious, watch both and let me know which version you prefer, especially for Multidimensional vs Tabular! BISM: Multidimensional vs. Tabular (TechEd North America 2012) BISM: Multidimensional vs. Tabular (TechEd Europe 2012) Many-to-Many Relationships in BISM Tabular (TechEd North America 2012) Many-to-Many Relationships in BISM Tabular (TechEd Europe 2012) If you are interested in learning SSAS Tabular, don’t miss the next SSAS Tabular Workshop online on September 3-4, 2012. We are also planning dates for another roadshow in Europe this fall, and I’m happy to announce we’ll have two dates in Germany, too. More updates in the coming weeks.

    Read the article

  • Domain forwarding with url substitution in the address bar

    - by Mario Duarte
    Hello, I have a blog being served by a machine I have at home. Since the IP can change, I set up a DynDNS domain to always point to that machine. However, I purchased a friendlier domain (at godaddy.com) and I would like to forward it to that blog. The problem is that if I simply forward it, users will see the DynDNS domain in the address bar and could potentially bookmark those URLs, and that's a problem. I noticed that godaddy.com has domain masking and, although it does hide the DynDNS domain in the address bar, it also keeps the same root address in the address bar even if I navigate to another page. I also have the feeling that search engines will not like this domain masking thing. Does anyone know how I can accomplish what I want?

    Read the article

  • Silverlight TV 20: Community Driven Development with WCF RIA Services

    In this episode, John talks with Jeff Handley about how the community's feedback really helped shape some features in WCF RIA Services. Jeff is very active in the community and has a wealth of knowledge about WCF RIA Services. Relevant links: John's Blog and John on Twitter Jeff's Blog and Jeff on Twitter WCF RIA Services ContosoSales sample application shown in the episode Silverlight 4 RC Features (or download here) Follow us on Twitter @SilverlightTV Silverlight Training...

    Read the article

  • Query Tuning Mastery at PASS Summit 2012: The Demos

    - by Adam Machanic
    For the second year in a row, I was asked to deliver a 500-level "Query Tuning Mastery" talk in room 6E of the Washington State Convention Center, for the PASS Summit. (Here's some information about last year's talk, on workspace memory.) And for the second year in a row, I had to deliver said talk at 10:15 in the morning, in a room used as overflow for the keynote, following a keynote speaker who didn't stop speaking on time. Frustrating! Last Thursday, after very, very quickly setting up and...(read more)

    Read the article

  • Linked servers and performance impact: Direction matters!

    - by Linchi Shea
    When you have some data on a SQL Server instance (say SQL01) and you want to move the data to another SQL Server instance (say SQL02) through openquery(), you can either push the data from SQL01, or pull the data from SQL02. To push the data, you can run a SQL script like the following on SQL01, which is the source server:
        -- The push script
        -- Run this on SQL01
        use testDB
        go
        insert openquery(SQL02, 'select * from testDB.dbo.target_table')
        select * from source_table;
    To pull the data, you can run...(read more)

    Read the article

  • Learning PostgreSql: bulk loading data

    - by Alexander Kuznetsov
    In this post we shall start loading data in bulk. For better performance of inserts, we shall load data into a table without constraints and indexes. This sounds familiar. There is a bulk copy utility, and it is very easy to invoke from C#. The following code feeds the output from a T-SQL stored procedure into a PostgreSql table:
        using (var pgTableTarget = new PgTableTarget(PgConnString, "Data.MyPgTable", GetColumns()))
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open...(read more)
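    PgTableTarget is the author's own wrapper and its body is cut off by the excerpt above. Purely as a hedged sketch of how such a wrapper might feed rows into PostgreSQL, Npgsql's binary COPY API (Npgsql 3.1 or later) looks roughly like this; the table and column names are hypothetical:

        using System.Data;
        using Npgsql;
        using NpgsqlTypes;

        static class PgBulkLoader
        {
            // Streams rows from any IDataReader (e.g. a SqlDataReader over a T-SQL
            // stored procedure) into a PostgreSQL table via COPY ... FROM STDIN.
            public static void Copy(IDataReader reader, string pgConnString)
            {
                using (var conn = new NpgsqlConnection(pgConnString))
                {
                    conn.Open();
                    using (var writer = conn.BeginBinaryImport(
                        "COPY data.my_pg_table (id, name) FROM STDIN (FORMAT BINARY)"))
                    {
                        while (reader.Read())
                        {
                            writer.StartRow();
                            writer.Write(reader.GetInt32(0), NpgsqlDbType.Integer);
                            writer.Write(reader.GetString(1), NpgsqlDbType.Text);
                        }
                        writer.Complete();   // commits the COPY; without it the rows are discarded
                    }
                }
            }
        }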

    Read the article

  • ALT.NET Seattle

    - by GeekAgilistMercenary
    Time to rock the ALT.NET scene and head up to the conference this weekend. I must say, out of all the conferences I have been to, the ALT.NET Conference is by far one of the best. Great minds, great attitudes, awesome chances to learn, awesome chances to expand on one's ideas with others that hit the same hurdles! All in all, last year was great and I am expecting it to be a great conference this year also. For more information check out the ALT.NET site: http://2010conf.altnetseattle.org/ To get more involved in the monthly ALT.NET events in Seattle: http://groups.google.com/group/altnetseattle http://www.facebook.com/group.php?gid=111345965570 http://www.altnetseattle.org/ If you are in the Seattle area this weekend, be sure to hit up the conference. For the original entry and other blog entries, check out my personal blog.

    Read the article

  • Performance impact: What is the optimal payload for SqlBulkCopy.WriteToServer()?

    - by Linchi Shea
    For many years, I have been using a C# program to generate the TPC-C compliant data for testing. The program relies on the SqlBulkCopy class to load the data generated by the program into the SQL Server tables. In general, the performance of this C# data loader is satisfactory. Lately however, I found myself in a situation where I needed to generate a much larger amount of data than I typically do and the data needed to be loaded within a confined time frame. So I was driven to look into the code...(read more)
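    The post's test harness is not included in this excerpt, but the knob being tuned is the batch size handed to SqlBulkCopy before WriteToServer() is called. A hedged C# sketch of timing a load at a given payload size (the destination table name is hypothetical):

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Diagnostics;

        static class BulkLoadTimer
        {
            public static TimeSpan Load(IDataReader rows, string connString, int batchSize)
            {
                var watch = Stopwatch.StartNew();
                using (var bulkCopy = new SqlBulkCopy(connString))
                {
                    bulkCopy.DestinationTableName = "dbo.order_line"; // hypothetical TPC-C style table
                    bulkCopy.BatchSize = batchSize;   // rows per batch: the "payload" being varied
                    bulkCopy.BulkCopyTimeout = 0;     // no timeout for large loads
                    bulkCopy.WriteToServer(rows);     // streams straight from the reader
                }
                watch.Stop();
                Console.WriteLine($"BatchSize={batchSize}: {watch.Elapsed}");
                return watch.Elapsed;
            }
        }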

    Read the article

  • How can I parse Amazon S3 log files?

    - by artlung
    What are the best options for parsing Amazon S3 (Simple Storage) log files? I've turned on logging and now I have log files that look like this: 858e709ba90996df37d6f5152650086acb6db14a67d9aaae7a0f3620fdefb88f files.example.com [08/Jul/2010:10:31:42 +0000] 68.114.21.105 65a011a29cdf8ec533ec3d1ccaae921c 13880FBC9839395C REST.GET.OBJECT example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg "GET /example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg HTTP/1.1" 200 - 32957 32957 12 10 "http://atlanta.craigslist.org/forums/?act=Q&ID=163218891" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19" - What are the best options for automating the processing of these log files? I'm not using any Amazon services other than S3.
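    One option among many, shown only as a hedged sketch: the format is space-delimited with a bracketed timestamp and quoted request, referrer, and user-agent fields, so a single regular expression can split each line into fields. The capture groups below follow the S3 server access log layout as I understand it; verify the field list against Amazon's documentation, and the file path is hypothetical.

        using System;
        using System.IO;
        using System.Text.RegularExpressions;

        class S3LogParser
        {
            // Groups: 1 owner, 2 bucket, 3 time, 4 remote IP, 5 requester, 6 request ID,
            // 7 operation, 8 key, 9 request URI, 10 status, 11 error code, 12 bytes sent,
            // 13 object size, 14 total time, 15 turn-around time, 16 referrer,
            // 17 user agent, 18 version ID (optional).
            static readonly Regex LogLine = new Regex(
                @"^(\S+) (\S+) \[([^\]]+)\] (\S+) (\S+) (\S+) (\S+) (\S+) ""([^""]*)"" " +
                @"(\S+) (\S+) (\S+) (\S+) (\S+) (\S+) ""([^""]*)"" ""([^""]*)""(?: (\S+))?",
                RegexOptions.Compiled);

            static void Main()
            {
                foreach (string line in File.ReadLines(@"C:\logs\s3\access.log")) // hypothetical path
                {
                    Match m = LogLine.Match(line);
                    if (!m.Success) continue;

                    string key = m.Groups[8].Value;        // requested object
                    string status = m.Groups[10].Value;    // HTTP status code
                    string referrer = m.Groups[16].Value;
                    Console.WriteLine($"{status} {key} (referrer: {referrer})");
                }
            }
        }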

    Read the article

  • SQL in the City (Charlotte) Wrap Up

    - by drsql
    Ok, it has been quite a while since the event, two weeks and a day to be exact, but I needed a rest before hitting Windows Live Writer again. Speaking is exhausting, traveling is exhausting, and well, I replaced my laptop and had to get all of my software back together. (Between Windows 8.1 sync features, Dropbox and Skydrive, it has never been easier…but I digress.) There are plenty of great vendors out there, but one of my favorites has always been Red-Gate. I have written half of a book with them,...(read more)

    Read the article

  • Can you over-normalize?

    - by drsql
    Now, don’t get too excited and grab your pitchforks and torches. Clearly, it is extremely possible to overdo something in the design, but very often normalization takes the rap as being the culprit. In my “Database Design Fundamentals” presentation, one of my favorite things to do is ask “What is the most important normal form?” 9 out of 10 times, someone answers “Third”. When I ask what they have against Fourth, they usually say that it makes the database work too slowly. But when they find out that...(read more)

    Read the article

  • Windows Azure Use Case: New Development

    - by BuckWoody
    This is one in a series of posts on when and where to use a distributed architecture design in your organization's computing needs. You can find the main post here: http://blogs.msdn.com/b/buckwoody/archive/2011/01/18/windows-azure-and-sql-azure-use-cases.aspx
    Description: Computing platforms evolve over time. Originally computers were directed by hardware wiring - that is, the “code” was the path of the wiring that directed an electrical signal from one component to another, or in some cases a physical switch controlled the path. From there software was developed, first in a very low-level machine language, then, when compilers were created, in computer languages that could more closely mimic written statements. These language statements can be compiled into the lower-level machine language still used by computers today. Microprocessors replaced logic circuits, sometimes with fewer instructions (Reduced Instruction Set Computing, RISC) and sometimes with more instructions (Complex Instruction Set Computing, CISC). The reason this history is important is that along each technology advancement, computer code has adapted. Writing software for a RISC architecture is significantly different than developing for a CISC architecture. And moving to a Distributed Architecture like Windows Azure also has specific implementation details that our code must follow. But why make a change? As I’ve described, we need to make the change to our code to follow advances in technology. There’s no point in change for its own sake, but as a new paradigm offers benefits to our users, it’s important for us to leverage those benefits where it makes sense. That’s most often done in new development projects. It’s a far simpler task to take a new project and adapt it to Windows Azure than to try and retrofit older code designed in a previous computing environment. We can still use the same coding languages (.NET, Java, C++) to write code for Windows Azure, but we need to think about the architecture of that code on a new project so that it runs in the most efficient, cost-effective way in a Distributed Architecture. As we receive new requests from the organization for new projects, a distributed architecture paradigm belongs in the decision matrix for the platform target.
    Implementation: When you are designing new applications for Windows Azure (or any distributed architecture) there are many important details to consider. But at the risk of over-simplification, there are three main concepts to learn and architect within the new code:
    Stateless Programming - Stateless programming is a prime concept within distributed architectures. Rather than each server owning the complete processing cycle, the information from an operation that needs to be retained (the “state”) should be persisted to another location (like storage) common to all machines involved in the process. An interesting learning process for Stateless Programming (although not unique to this language type) is to learn Functional Programming.
    Server-Side Processing - Along with developing using a Stateless Design, the closer you can locate the code processing to the data, the less expensive and faster the code will run. When you control the network layer, this is less important, since you can send vast amounts of data between the server and client, allowing the client to perform processing. In a distributed architecture, you don’t always own the network, so its performance is unpredictable. Also, you may not be able to control the platform the user is on (such as a smartphone, PC or tablet), so it’s imperative to deliver only results and graphical elements where possible.
    Token-Based Authentication - Also called “Claims-Based Authorization”, this code practice means that instead of allowing a user to log on once and then running code in that context, a more granular level of security is used. A “token” or “claim”, often represented as a Certificate, is sent along for a series of requests, or even a single request. In other words, every call to the code is authenticated against the token, rather than allowing a user free rein within the code call. While this is more work initially, it can bring a greater level of security, and it is far more resilient to disconnections.
    Resources: See the references of “Nondistributed Deployment” and “Distributed Deployment” at the top of this article for more information with graphics: http://msdn.microsoft.com/en-us/library/ee658120.aspx Stack Overflow has a good thread on functional programming: http://stackoverflow.com/questions/844536/advantages-of-stateless-programming Another good discussion on Stack Overflow on server-side processing is here: http://stackoverflow.com/questions/3064018/client-side-or-server-side-processing Claims-Based Authorization is described here: http://msdn.microsoft.com/en-us/magazine/ee335707.aspx
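    To make the stateless idea concrete, here is a small hedged C# sketch of my own (not an Azure API; the store interface and all names are hypothetical): the service keeps nothing in memory between calls and persists every piece of retained state to a store that all machines share, so any server in the farm can handle the next request.

        using System.Collections.Concurrent;

        // Any shared external store would do: table storage, a database, a cache service.
        // The interface and the in-memory stand-in are purely illustrative.
        interface IStateStore
        {
            void Save(string key, string value);
            string Load(string key);
        }

        class InMemoryStateStore : IStateStore   // stand-in for a real shared store
        {
            private readonly ConcurrentDictionary<string, string> _data =
                new ConcurrentDictionary<string, string>();
            public void Save(string key, string value) => _data[key] = value;
            public string Load(string key) => _data.TryGetValue(key, out var v) ? v : null;
        }

        class CartService
        {
            private readonly IStateStore _store;
            public CartService(IStateStore store) => _store = store;

            // Stateless: no per-user fields are kept between calls, so the next
            // request for the same cart can land on any server in the farm.
            public void AddItem(string cartId, string item)
            {
                string existing = _store.Load(cartId) ?? "";
                _store.Save(cartId, existing + item + ";");
            }

            public string GetCart(string cartId) => _store.Load(cartId) ?? "";
        }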

    Read the article

  • Using Hadoop (HDInsight) with Microsoft - Two (OK, Three) Options

    - by BuckWoody
    Microsoft has many tools for “Big Data”. In fact, you need many tools – there’s no product called “Big Data Solution” in a shrink-wrapped box – if you find one, you probably shouldn’t buy it. It’s tempting to want a single tool that handles everything in a problem domain, but with large, complex data, that isn’t a reality. You’ll mix and match several systems, open and closed source, to solve a given problem. But there are tools that help with handling data at large, complex scales. Normally the best way to do this is to break up the data into parts, and then put the calculation engines for that chunk of data right on the node where the data is stored. These systems are in a family called “Distributed File and Compute”. Microsoft has a couple of these, including the High Performance Computing edition of Windows Server. Recently we partnered with Hortonworks to bring the Apache Foundation’s release of Hadoop to Windows. And as it turns out, there are actually two (technically three) ways you can use it. (There’s a more detailed set of information here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx, I’ll cover the options at a general level below)
    First Option: Windows Azure HDInsight Service. Your first option is that you can simply log on to a Hadoop control node and begin to run Pig or Hive statements against data that you have stored in Windows Azure. There’s nothing to set up (although you can configure things where needed), and you can send the commands, get the output of the job(s), and stop using the service when you are done – and repeat the process later if you wish. (There are also connectors to run jobs from Microsoft Excel, but that’s another post) This option is useful when you have a periodic burst of work for a Hadoop workload, or the data collection has been happening into Windows Azure storage anyway. That might be from a web application, the logs from a web application, telemetrics (remote sensor input), and other modes of constant collection. You can read more about this option here: http://blogs.msdn.com/b/windowsazure/archive/2012/10/24/getting-started-with-windows-azure-hdinsight-service.aspx
    Second Option: Microsoft HDInsight Server. Your second option is to use the Hadoop Distribution for on-premises Windows called Microsoft HDInsight Server. You set up the Name Node(s), Job Tracker(s), and Data Node(s), among other components, and you have control over the entire ecostructure. This option is useful if you want to have complete control over the system, leave it running all the time, or you have a huge quantity of data that you have to bulk-load constantly – something that isn’t going to be practical with a network transfer or disk-mailing scheme. You can read more about this option here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx
    Third Option (unsupported): Installation on Windows Azure Virtual Machines. Although unsupported, you could simply use a Windows Azure Virtual Machine (we support both Windows and Linux servers) and install Hadoop yourself – it’s open-source, so there’s nothing preventing you from doing that. Aside from being unsupported, there are other issues you’ll run into with this approach – primarily involving performance and the amount of configuration you’ll need to do to access the data nodes properly. But for a single-node installation (where all components run on one system) such as learning, demos, training and the like, this isn’t a bad option. Did I mention that’s unsupported? :) You can learn more about Windows Azure Virtual Machines here: http://www.windowsazure.com/en-us/home/scenarios/virtual-machines/ And more about Hadoop and the installation/configuration (on Linux) here: http://en.wikipedia.org/wiki/Apache_Hadoop And more about the HDInsight installation here: http://www.microsoft.com/web/gallery/install.aspx?appid=HDINSIGHT-PREVIEW
    Choosing the right option: Since you have two or three routes you can go, the best thing to do is evaluate the need you have, and place the workload where it makes the most sense. My suggestion is to install the HDInsight Server locally on a test system, and play around with it. Read up on the best ways to use Hadoop for a given workload, understand the parts, write a little Pig and Hive, and get your feet wet. Then sign up for a test account on HDInsight Service, and see how that leverages what you know. If you're a true tinkerer, go ahead and try the VM route as well. Oh - there’s another great reference on the Windows Azure HDInsight that just came out, here: http://blogs.msdn.com/b/brunoterkaly/archive/2012/11/16/hadoop-on-azure-introduction.aspx

    Read the article

  • 7 reasons you had to be at JavaOne Latin America 2012

    - by Bruno.Borges
    Yesterday was 12/12/12, and everybody went crazy on Twitter with cool memes like this one. And maybe you are now wondering why I mentioned 7 (seven) in the blog title. Because I wanted to play with numbers? Yes! Today is 7 days after JavaOne Latin America 2012 ended (... and I had to figure out an excuse for taking so long to blog about it...). So unless you were at JavaOne Latin America this year, here are 7 things you missed:
    OTN Lounge mini-theatre. There was a mini-theatre holding several lightning talks. We had people from the SouJava JUG, GoJava JUG, Globalcode, and several other Java gurus and companies running demos, talks, and even more. For example, @drspockbr talked about the ScrumToys project, which demonstrates the power of JSF.
    Hands On Lab for JAX-RS and WebSockets. One of the cool things to do during JavaOne is to come to these Hands On Labs and really do something using new technologies with the help of experts. This one in particular was covered by me, Arun Gupta, and Reza Rahman. The HOL had more people than laptops (and we had 48 laptops!) interested in understanding and learning about the new stuff that is coming with Java EE 7, things like JAX-RS, Server-Sent Events and WebSockets. Hey, if you want to try this HOL by yourself, it is available on Github, so go for it! If you have questions, just let me know!
    Java Community Keynote. This keynote presented a lot of cool things like startups using Java in their projects, the Duke Awards, SouJava winning the JCP Outstanding Award, the Java Band, and even more! It was really a space where the Java community could present what they are doing and what they want to do. There's a lot of interest in the Adopt-a-JSR program and Adopt-OpenJDK. There's also an Adopt-a-JavaEE-JSR program! Take a look if you want to participate and Make the Future Java.
    Java EE (JMS, JAX-RS) sessions from Reza Rahman, the HeavyMetal guy. Reza is a well-known professional and Java EE enthusiast from the community who just joined Oracle this year. His sessions were very well attended, perhaps because of the high interest in the new things coming to Java EE 7, like JMS 2.0 and JAX-RS 2.0. If you want to look at what he did at this JavaOne edition, read his blog post. By the way, if you like Java and heavy metal, you should follow him on Twitter as well! :-)
    Java EE (WebSockets, HTML5) sessions from Arun Gupta, the GlassFish guy. If you don't know Arun Gupta, no worries. You will have time to know about him while you read his Java EE 6 Pocket Guide. Arun has been evangelizing Java EE for a long time, and is now spreading the word about the new upcoming version, Java EE 7. He gave one talk about HTML5 productivity on the Java EE 7 platform, and another one on building web apps with WebSockets. Pretty neat! Arun blogged about JavaOne Latin America as well. Read it here.
    Java Embedded and JavaFX. If there are two things that are really trending in the Java world right now besides Java EE 7, they are certainly JavaFX and Java Embedded. There were 14 talks covering Java Embedded, from Java Cards to the Raspberry Pi, from Java ME to Java on your TV with Ginga-J. The Internet of Things is becoming real, and Java is the only platform today that can connect it all in a standardized and concise way. JavaFX gained a lot of attention too. There were 8 sessions covering what the platform has to offer in terms of rich user experience.
    The JavaFX Scene Builder is an awesome tool to start playing with designing a UI, and coding for JavaFX is like coding Swing with 8 hands, one holding your coffee cup. You can achieve a lot with your two hands (unless you really have 8 hands, then you can achieve 4 times more :-). If you want to read more about JavaFX, go to Stephen Chin's blog post.
    GlassFish and Friends Party, 1st edition at JavaOne Latin America. This is probably the thing I'm most proud of. We brought to Brazil the tradition of holding a happy hour for all GlassFish and Java EE friends. This party started almost 7 years ago in San Francisco, and it was about time to bring it to Brazil! The party happened on Tuesday night, right after the JavaOne General Keynote, at the Tribeca Pub. We had about 80 attendees and met a lot of Java EE developers there! People from JUGs, Oracle, Locaweb and Red Hat showed up too, including some execs from Oracle who couldn't resist and would not miss a party like this one. Lots of caipirinhas, beer and food for everyone, some cool music... even The Fish walking around the party with Juggy! You can see more photos from the party in an album I shared with the recently created GlassFish Brasil community on Google+ here (but you may be more interested in joining the GlassFish English community). There are also more pictures that Arun took and shared at this link.
    So now you may want to consider coming to Brazil next year! Java EE 7 is on its way, and Brazil is happily and patiently waiting for it, with a lot of enthusiasm. By the way, GlassFish and Java EE 6 just celebrated a happy birthday!

    Read the article
