Search Results

Search found 61615 results on 2465 pages for 'execution time'.

Page 574/2465 | < Previous Page | 570 571 572 573 574 575 576 577 578 579 580 581  | Next Page >

  • Introducing Oracle VM VirtualBox

    - by Fat Bloke
    I guess these things always take longer than expected and, while the dust is still not completely settled in all the ex-Sun geographies, it is high time we started looking at some of the great new assets in the Oracle VM portfolio. So let's start with one of the most exciting: Oracle VM VirtualBox. VirtualBox is cross-platform virtualization software, oftentimes called a hypervisor, and it runs on Windows, Linux, Solaris and the Mac. This means that you download it, install it on your existing platform, and start creating and running virtual machines alongside your existing applications. For example, on my Mac I can run Oracle Enterprise Linux and Windows 7 alongside my Mac apps. VirtualBox use has grown phenomenally, to the point that at Sun it was the 3rd most popular download behind Java and MySQL. Its success can be attributed to the fact that it doesn't need dedicated hardware, it can be installed on either client or server classes of computers, is very easy to use, and is free for personal use. And, as you might expect, VirtualBox has its own vibrant community too, over at www.virtualbox.org. There are hundreds of tutorials out there about how to use VirtualBox to create VMs and install different operating systems ranging from Windows 7 to ChromeOS, and if you don't want to install an operating system yourself, you can download pre-built virtual appliances from community sites such as VirtualBox Images or commercial companies selling subscriptions to whole application stacks, such as JumpBox. In no time you'll be creating and sharing your own VMs using the VirtualBox OVF export and import function. VirtualBox is deceptively powerful. Under the simple GUI lies a formidable engine capable of running heavyweight multi-CPU virtual workloads, exhibiting enterprise capabilities including a built-in remote display server, an iSCSI initiator for connecting to shared storage, and the ability to teleport running VMs from one host to another. And for solution builders, you should be aware that VirtualBox has a scriptable command line interface, an SDK, and rich web service APIs. To get a further feel for what VirtualBox is capable of, check out some of these short movies or simply go download it for yourself. - FB

    Read the article

  • Issue 15: Introducing David Callaghan

    - by rituchhibber
    DAVID'S VIEW: INTRODUCING DAVID CALLAGHAN. David Callaghan is the Senior Vice President, Alliances & Channels, for Oracle EMEA. He is responsible for all elements of the Oracle Partner Network across the region and leads Oracle as it continues to deliver customer success through the alignment of Oracle's applications and hardware engineered to work together. As I reflect on our last quarter, I thank all our partners for your continued commitment and expertise in embracing the unique opportunity we have before us. The ability to engage with hardware, applications and technology is a real differentiator. We have been able to engage with deep specialization in individual products for some time, which has brought tremendous benefits. But now we can strengthen this further with the broad stack specialization that Oracle on Oracle brings. Now is the time to make that count. While customers are finishing spending this year's budget and planning their spend for the next calendar year, it is now that we need to build the quality opportunities and pipeline for the rest of the year. We have OpenWorld just around the corner with its compelling new product announcements and environment to engage customers at all levels. Make sure you use this event, and every opportunity it brings. In the next quarter you can expect to see targeted 'value creation' campaigns driven by Oracle, and I encourage you to exploit these where they will have greatest impact. My team will be engaging closely with their Oracle sales colleagues to help them leverage the tremendous value you bring, and to develop their ability to work effectively and independently with you, our partners. My team and I are all relentlessly committed to achieving partner, and customer, satisfaction to demonstrate the value of the Passion for Partnering that we all share. With best regards, David

    Read the article

  • Creating an email notification system based on polling database rows

    - by Ashish Sharma
    I have to design an email notification system based on the following requirements: The email notifications are created by polling rows in a MySQL 5.5 DB table when they are in a particular 'Completed' state. The email notification should be sent out no more than 5 minutes from the time the row was created in the DB table (at the time of row creation the state of the row might not yet be 'Completed'). If a row does not reach the 'Completed' state within those 5 minutes, a separate email notification needs to be sent (basically telling the user that the original email notification will be delayed), and then the real notification is sent when the row state eventually reaches 'Completed'. The rest of the system requirements are: adding relevant checks to monitor the whole system via an MBeans interface, and the system should be scalable so that if the rate of DB table row creation increases, the email notification system can ramp up with it. So I request suggestions along the following lines: What approach should I take in solving the problem described, from a programming/design pattern point of view? Suggestions for any third-party plugin/software that can be used to solve the problem described? Points to take care of regarding scalability and monitoring the health of the system? Java is the language of preference, but I am open to using off-the-shelf components that can be interfaced with Java or provide standard ports for communication. Currently I do have an in-house-grown system (written in Java) that caters to the specified requirements, but it's now crumbling under increased load, so I want to give the problem a fresh look. Thanks in advance, Ashish
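    For illustration only, here is a minimal, hedged sketch of the polling approach in Java (a ScheduledExecutorService plus plain JDBC). Every table, column and helper name below is hypothetical, the email send is stubbed out, and a real implementation would also update the notified/delay_notified flags transactionally so rows are not processed twice:

        // Minimal polling sketch -- schema and names are assumptions, not part of the question.
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;
        import java.util.concurrent.Executors;
        import java.util.concurrent.ScheduledExecutorService;
        import java.util.concurrent.TimeUnit;

        public class NotificationPoller {

            private static final String JDBC_URL = "jdbc:mysql://localhost:3306/app"; // assumed URL
            private final ScheduledExecutorService scheduler =
                    Executors.newSingleThreadScheduledExecutor();

            public void start() {
                // Poll well inside the 5-minute SLA, e.g. every 30 seconds.
                scheduler.scheduleAtFixedRate(this::poll, 0, 30, TimeUnit.SECONDS);
            }

            private void poll() {
                String completedSql =
                        "SELECT id FROM notifications WHERE state = 'Completed' AND notified = 0";
                String delayedSql =
                        "SELECT id FROM notifications WHERE state <> 'Completed' AND delay_notified = 0"
                        + " AND created_at < NOW() - INTERVAL 5 MINUTE";
                try (Connection con = DriverManager.getConnection(JDBC_URL, "user", "pass");
                     Statement st = con.createStatement()) {
                    try (ResultSet rs = st.executeQuery(completedSql)) {
                        while (rs.next()) {
                            sendCompletionEmail(rs.getLong("id")); // stub
                        }
                    }
                    try (ResultSet rs = st.executeQuery(delayedSql)) {
                        while (rs.next()) {
                            sendDelayEmail(rs.getLong("id"));      // stub
                        }
                    }
                } catch (SQLException e) {
                    e.printStackTrace(); // surface this through the monitoring MBean in a real system
                }
            }

            private void sendCompletionEmail(long id) { /* JavaMail or a message queue producer here */ }
            private void sendDelayEmail(long id)      { /* likewise */ }
        }

    Registering the poller's health data (last run time, queue depth, error count) as a JMX MBean would cover the monitoring requirement, and handing the actual sending off to a message queue is one common way to scale the sending side independently of the polling side.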

    Read the article

  • How to find keycodes for Fn + keys?

    - by budwiser
    I'm trying to find out the keycode for the Fn+Left-arrow keypress. xev outputs: FocusOut event, serial 36, synthetic NO, window 0x3c00001, mode NotifyGrab, detail NotifyAncestor FocusIn event, serial 36, synthetic NO, window 0x3c00001, mode NotifyUngrab, detail NotifyAncestor KeymapNotify event, serial 36, synthetic NO, window 0x0, keys: 4294967213 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0. If it is telling me the keycode here, I'm not able to interpret it, so help would be appreciated. I'm also curious to find out whether it's possible to bind something to Fn+Del, but when trying out this combination, xev outputs: KeyPress event, serial 36, synthetic NO, window 0x3c00001, root 0xad, subw 0x0, time 1984903, (-666,480), root:(53,533), state 0x0, keycode 119 (keysym 0xffff, Delete), same_screen YES, XLookupString gives 1 bytes: (7f) " " XmbLookupString gives 1 bytes: (7f) " " XFilterEvent returns: False KeyRelease event, serial 36, synthetic NO, window 0x3c00001, root 0xad, subw 0x0, time 1985008, (-666,480), root:(53,533), state 0x0, keycode 119 (keysym 0xffff, Delete), same_screen YES, XLookupString gives 1 bytes: (7f) " " XFilterEvent returns: False, which is exactly the same as pressing Del without Fn. So, to summarize: How can I find the keycode for Fn+Left arrow? Is it even possible to bind something to Fn+Del, or am I tilting at windmills here?

    Read the article

  • Platform Builder: Removing the Version Information from the Desktop

    - by Bruce Eitman
    The question of how to remove the version information from the desktop has been around for a long time. It came up again today. The question is about the string displayed on the desktop that looks like one of these, depending on the OS version: Windows Embedded CE v6.00 (Build xxxx on xxxx) Microsoft Windows CE v5.00 (Build xxxx on xxxx) Microsoft Windows CE .NET v4.20 (Build xxxx on xxxx) I have looked into this in the past, but never really had a definitive answer. I have an answer now. The short answer is that the version information is displayed if the code is built without SHIP_BUILD defined. I have to be honest, I have given this answer in the newsgroups in the past, but I still had questions. My questions have come from different build machines giving different results. I have noticed that some engineers' workstations would have the version information displayed, while others did not. I always stopped short of spending time investigating further because our release build machines never resulted in the version information being displayed. But, we do not typically define SHIP_BUILD for our releases because our customers want or need the debug output. So today I dug further into the question. The answer is actually quite simple. Microsoft builds the retail shell libraries with SHIP_BUILD defined and releases the libraries with Platform Builder. Normally the source code does not need to be built during Sysgen, so the libraries that Microsoft delivered are linked to create the Explorer shell. So typically the Explorer shell displays the version information for debug builds, but does not for retail builds. The trouble comes when the source code is forced to be rebuilt for a retail build. This might happen if an engineer uses "Build and Sysgen" or builds the Public\Shell folder from the command line with the clean flag. I am not sure if Build and Sysgen will cause the problem or not – I have never used Build and Sysgen and I strongly advise against using it (see Platform Builder: Don't use Build and Sysgen). Copyright © 2010 – Bruce Eitman. All Rights Reserved.

    Read the article

  • Gimme Gimme Gimme!

    - by steve.diamond
    Today is my birthday. And you know, there used to be a time when I dreaded birthdays. But now, as I reach my 37th year (that's my Polar Body Test age), I'm re-learning to really really appreciate being here. Now, what the heck does any of this have to do with CRM or this blog? Easy! Here is the present I would like from you. 1) Please tell us how we're doing on this blog. Do you like what you're seeing? Do you NOT like what you're seeing? Why? What types of topics would you like to see more or less of from us? Do you think we're running too much of an Oracle infomercial here? Conversely, would you like us to spend more time focusing on Oracle solutions? If so, which ones are of most interest to you? 2) Let's assume you DO like what you're seeing and reading here. Please tell a friend. Pass it on. You can write a comment below or submit a comment on our Facebook Fan page (http://facebook.com/oraclecrm). If you're an Oracle employee, please simply send me an email. And if you work here at HQ, bring me some key lime pie. And last but not least, thank you!

    Read the article

  • Conditional Operator Example

    - by mbcrump
    If you haven't taken the time to learn conditional operators, then now is the time. I've added a quick and dirty example for those on the forums.

    Code Snippet

        using System;
        using System.Net.Mail;
        using System.Net;
        using System.Globalization;
        using System.Windows.Forms;

        class Demo
        {
            // Please use conditional statements in your code. See example below.
            public static void Main()
            {
                int dollars = 10;

                // Bad Coder Bad !!! Don't do this
                if (dollars == 1)
                {
                    Console.WriteLine("Please deposit {0} dollar.", dollars);
                }
                else
                {
                    Console.WriteLine("Please deposit {0} dollars.", dollars);
                }

                // Good Coder Good !!! Do this
                // (empty string rather than a space, so "1 dollar." prints without a stray space)
                Console.WriteLine("Please deposit {0} dollar{1}.", dollars, dollars == 1 ? "" : "s");
                //                                                          expression ? true-value : false-value

                Console.ReadLine();
            }
        }

    Read the article

  • How do you KISS ?

    - by Conor
    The KISS principle is a highly quoted design mantra. The aim of this principle is to stamp out unnecessary complexity on a project. This is a good thing, saving time, energy and money. It can lead to a relatively stress-free implementation and a simple, elegant, maintainable end product. A lot of discussion on KISS provides mechanisms to simplify requirements, design and implementation. Things that spring to mind include: avoid scope creep; simple obvious design and code; minimal run-time dependencies; refactoring to maintain simplicity; etc. However there are a lot of implicit things that we do to KISS. Examples: small team sizes; minimal management layers; tidy desk; mastery of a single IDE; clear concise error messages; scripts to automate/encapsulate tasks; etc. The purpose of this question is to derive a checklist of KISS items. I'm especially interested in non-obvious items.

    Read the article

  • Third year in a row- Microsoft MVP again!!

    - by Jalpesh P. Vadgama
    Today is Sunday and I was not expecting this, as today is a holiday, although I knew it was the Microsoft MVP renewal day. In the evening I got the congratulations email from Microsoft. Yeah!! I am a Microsoft Most Valuable Professional again. I got the same message as part of the MVP award. Thanks again, Microsoft. Dear Jalpesh Vadgama, Congratulations! We are pleased to present you with the 2012 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in Visual C# technical communities during the past year. The feeling is the same as the first time. I am going to dedicate this award to my family. My parents, who always inspired me to do new things. My wife, who sacrifices her time so that I can write blogs. My brother, who supports me in every possible way. On this occasion, I would also like to thank my readers; without their support it would not have been possible to achieve this. Thanks for reading my blog!! Please do keep reading it. I will try to write as much as possible. I would also like to thank Tanmay Kapoor, my MVP lead, for his continuous support. Once again, thank you all for your continuous support and love. There are lots of new technologies in the Microsoft stack and I am going to write lots of blog posts about all the new stuff. So stay tuned.

    Read the article

  • Accessing Server-Side Data from Client Script: Accessing JSON Data From an ASP.NET Page Using jQuery

    When building a web application, we must decide how and when the browser will communicate with the web server. The ASP.NET WebForms model greatly simplifies web development by providing a straightforward mechanism for exchanging data between the browser and the server. With WebForms, each ASP.NET page's rendered output includes a <form> element that performs a postback to the same page whenever a Button control within the form is clicked, or whenever the user modifies a control whose AutoPostBack property is set to True. On postback, the server sends the entire contents of the web page back to the browser, which then displays this new content. With WebForms we don't need to spend much time or effort thinking about how or when the browser will communicate with the server or how that returned information will be processed by the browser. It just works. While this approach certainly works and has its advantages, it's not without its drawbacks. The primary concern with postback forms is that they require a large amount of information to be exchanged between the browser and the server. Specifically, the browser sends back all of its form fields (including hidden ones, like view state, which may be quite large) and then the server sends back the entire contents of the web page. Granted, there are scenarios where this large quantity of data needs to be exchanged, but in many cases we can use techniques that exchange much less information. However, these techniques necessitate spending more time and effort thinking about how and when to have the browser communicate with the server and intelligently deciding on what information needs to be exchanged. This article, the first in a multi-part series, examines different techniques for accessing server-side data from a browser using client-side script. Throughout this series we will explore alternative ways to expose data on the server so that it can be accessed from the browser using script; we will also examine various tools for communicating with the server from JavaScript, including jQuery and the ASP.NET AJAX library. Read on to learn more! Read More >

    Read the article

  • ADF Bounded Taskflow Activation

    - by Vijay Mohan
    Hey guys, it's really been a while since I last blogged. I just came across a hard-to-debug scenario, so I thought of sharing it for the benefit of ADF developers. I had a page fragment (jsff) wrapped inside a bounded task flow whose activation was conditional and based on a requestScope property (be it a requestScope variable or a property coming from a requestScope bean). As soon as the task flow activates and the page renders, the requestScope parameter's life span ends. After that, when you raise an event inside the page (a commandLink click, mouseHover, valueChange event, etc.), the event gets fired the first time but fails to effect the change in the page; moreover, on subsequent attempts the event itself doesn't get fired. Any guesses as to what the culprit could be? I guess I already gave the reason in the opening paragraph. The first time the event fires, the framework sees that the page is already in an inactive state, so it fails to effect the change, and on subsequent attempts it doesn't even fire the event because it already knows that the page/region is inactive. So, in such a scenario we must use either a pageFlowScope property or a transient VO attribute, which lives as long as the page does.
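    As a hedged illustration of that fix, here is a minimal managed-bean sketch (the bean name, attribute name and EL expression are all made up) that stores the flag in pageFlowScope through the standard ADFContext API; the task flow binding's activation condition would then reference something like #{pageFlowScope.regionActive} instead of a requestScope expression:

        // Sketch only: keep the flag in pageFlowScope, which survives across requests
        // for the life of the task flow, unlike requestScope.
        import java.util.Map;
        import oracle.adf.share.ADFContext;

        public class RegionStateBean {

            public void activateRegion() {
                Map<String, Object> scope = ADFContext.getCurrent().getPageFlowScope();
                scope.put("regionActive", Boolean.TRUE); // hypothetical attribute name
            }

            public boolean isRegionActive() {
                Object flag = ADFContext.getCurrent().getPageFlowScope().get("regionActive");
                return Boolean.TRUE.equals(flag);
            }
        }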

    Read the article

  • T-SQL Tuesday #53-Matt's Making Me Do This!

    - by Most Valuable Yak (Rob Volk)
    Hello everyone! It's that time again, time for T-SQL Tuesday, the wonderful blog series started by Adam Machanic (b|t). This month we are hosted by Matt Velic (b|t) who asks the question, "Why So Serious?", in celebration of April Fool's Day. He asks the contributors for their dirty tricks. And for some reason that escapes me, he and Jeff Verheul (b|t) seem to think I might be able to write about those. Shocked, I am! Nah, not really. They're absolutely right, this one is gonna be fun! I took some inspiration from Matt's suggestions, namely Resource Governor and Login Triggers. I've done some interesting login trigger stuff for a presentation, but nothing yet with Resource Governor. Best way to learn it! One of my oldest pet peeves is abuse of the sa login. Don't get me wrong, I use it too, but typically only as SQL Agent job owner. It's been a while since I've been stuck with it, but back when I started using SQL Server, EVERY application needed sa to function. It was hard-coded and couldn't be changed. (welllllll, that is if you didn't use a hex editor on the EXE file, but who would do such a thing?) My standard warning applies: don't run anything on this page in production. In fact, back up whatever server you're testing this on, including the master database. Snapshotting a VM is a good idea. Also make sure you have other sysadmin level logins on that server. So here's a standard template for a logon trigger to address those pesky sa users:

        CREATE TRIGGER SA_LOGIN_PRIORITY ON ALL SERVER
        WITH ENCRYPTION, EXECUTE AS N'sa'
        AFTER LOGON
        AS
        IF ORIGINAL_LOGIN()<>N'sa' OR APP_NAME() LIKE N'SQL Agent%' RETURN;
        -- interesting stuff goes here
        GO

    What can you do for "interesting stuff"? Books Online limits itself to merely rolling back the logon, which will throw an error (and alert the person that the logon trigger fired). That's a good use for logon triggers, but really not tricky enough for this blog. Some of my suggestions are below:

        WAITFOR DELAY '23:59:59';

    Or:

        EXEC sp_MSforeach_db 'EXEC sp_detach_db ''?'';'

    Or:

        EXEC msdb.dbo.sp_add_job @job_name=N'`', @enabled=1, @start_step_id=1, @notify_level_eventlog=0, @delete_level=3;
        EXEC msdb.dbo.sp_add_jobserver @job_name=N'`', @server_name=@@SERVERNAME;
        EXEC msdb.dbo.sp_add_jobstep @job_name=N'`', @step_id=1, @step_name=N'`', @command=N'SHUTDOWN;';
        EXEC msdb.dbo.sp_start_job @job_name=N'`';

    Really, I don't want to spoil your own exploration, try it yourself! The thing I really like about these is it lets me promote the idea that "sa is SLOW, sa is BUGGY, don't use sa!". Before we get into Resource Governor, make sure to drop or disable that logon trigger. They don't work well in combination. (Had to redo all the following code when SSMS locked up.) Resource Governor is a feature that lets you control how many resources a single session can consume. The main goal is to limit the damage from a runaway query. But we're not here to read about its main goal or normal usage! I'm trying to make people stop using sa BECAUSE IT'S SLOW!
    Here's how RG can do that:

        USE master;
        GO
        CREATE FUNCTION dbo.SA_LOGIN_PRIORITY() RETURNS sysname
        WITH SCHEMABINDING, ENCRYPTION AS
        BEGIN
        RETURN CASE
            WHEN ORIGINAL_LOGIN()=N'sa' AND APP_NAME() NOT LIKE N'SQL Agent%'
            THEN N'SA_LOGIN_PRIORITY'
            ELSE N'default' END
        END
        GO
        CREATE RESOURCE POOL SA_LOGIN_PRIORITY
        WITH (
         MIN_CPU_PERCENT = 0
        ,MAX_CPU_PERCENT = 1
        ,CAP_CPU_PERCENT = 1
        ,AFFINITY SCHEDULER = (0)
        ,MIN_MEMORY_PERCENT = 0
        ,MAX_MEMORY_PERCENT = 1
        -- ,MIN_IOPS_PER_VOLUME = 1 ,MAX_IOPS_PER_VOLUME = 1 -- uncomment for SQL Server 2014
        );
        CREATE WORKLOAD GROUP SA_LOGIN_PRIORITY
        WITH (
         IMPORTANCE = LOW
        ,REQUEST_MAX_MEMORY_GRANT_PERCENT = 1
        ,REQUEST_MAX_CPU_TIME_SEC = 1
        ,REQUEST_MEMORY_GRANT_TIMEOUT_SEC = 1
        ,MAX_DOP = 1
        ,GROUP_MAX_REQUESTS = 1
        ) USING SA_LOGIN_PRIORITY;
        ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=dbo.SA_LOGIN_PRIORITY);
        ALTER RESOURCE GOVERNOR RECONFIGURE;

    From top to bottom: Create a classifier function to determine which pool the session should go to (more info on classifier functions is in Books Online). Create the pool and provide a generous helping of resources for the sa login. Create the workload group and further prioritize those resources for the sa login. Apply the classifier function and reconfigure RG to use it. I have to say this one is a bit sneakier than the logon trigger, not least because you don't get any error messages. I heartily recommend testing it in Management Studio, and click around the UI a lot, there's some fun behavior there. And DEFINITELY try it on SQL 2014 with the IO settings included! You'll notice I made allowances for SQL Agent jobs owned by sa, they'll go into the default workload group. You can add your own overrides to the classifier function if needed. Some interesting ideas I didn't have time for but expect you to get to before me: set up different pools/workgroups with different settings and randomize which one the classifier chooses; do the same but base it on time of day (the Books Online example covers this); or base it on which workstation it connects from. This can be modified for certain special people in your office who either don't listen, or are attracted (and attractive) to you. And if things go wrong you can always use the following from another sysadmin or Dedicated Admin connection:

        ALTER RESOURCE GOVERNOR DISABLE;

    That will let you go in and either fix (or drop) the pools, workgroups and classifier function. So now that you know these types of things are possible, and if you are tired of your team using sa when they shouldn't, I expect you'll enjoy playing with these quite a bit! Unfortunately, the aforementioned Dedicated Admin Connection kinda poops on the party here. Books Online for both topics will tell you that the DAC will not fire either feature. So if you have a crafty user who does their research, they can still sneak in with sa and do their bidding without being hampered. Of course, you can still detect their login via various methods, like a server trace, SQL Server Audit, extended events, and enabling "Audit Successful Logins" on the server. These all have their downsides: traces take resources, extended events and SQL Audit can't fire off actions, and enabling successful logins will bloat your error log very quickly. SQL Audit is also limited unless you have Enterprise Edition, and Resource Governor is Enterprise-only. And WORST OF ALL, these features are all available and visible through the SSMS UI, so even a doofus developer or manager could find them. Fortunately there are Event Notifications!
    Event notifications are becoming one of my favorite features of SQL Server (keep an eye out for more blogs from me about them). They are practically unknown and heinously underutilized. They are also a great gateway drug to using Service Broker, another great but underutilized feature. Hopefully this will get you to start using them, or at least your enemies in the office will once they read this, and then you'll have to learn them in order to fix things. So here's the setup:

        USE msdb;
        GO
        CREATE PROCEDURE dbo.SA_LOGIN_PRIORITY_act WITH ENCRYPTION AS
        DECLARE @x XML, @message nvarchar(max);
        RECEIVE @x=CAST(message_body AS XML) FROM SA_LOGIN_PRIORITY_q;
        IF @x.value('(//LoginName)[1]','sysname')=N'sa'
           AND @x.value('(//ApplicationName)[1]','sysname') NOT LIKE N'SQL Agent%'
        BEGIN
            -- interesting activation procedure stuff goes here
        END
        GO
        CREATE QUEUE SA_LOGIN_PRIORITY_q
            WITH STATUS=ON, RETENTION=OFF,
            ACTIVATION (PROCEDURE_NAME=dbo.SA_LOGIN_PRIORITY_act, MAX_QUEUE_READERS=1, EXECUTE AS OWNER);
        CREATE SERVICE SA_LOGIN_PRIORITY_s ON QUEUE SA_LOGIN_PRIORITY_q([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
        CREATE EVENT NOTIFICATION SA_LOGIN_PRIORITY_en ON SERVER WITH FAN_IN
            FOR AUDIT_LOGIN TO SERVICE N'SA_LOGIN_PRIORITY_s', N'current database'
        GO

    From top to bottom: Create activation procedure for event notification queue. Create queue to accept messages from event notification, and activate the procedure to process those messages when received. Create service to send messages to that queue. Create event notification on AUDIT_LOGIN events that fires the service. I placed this in msdb as it is an available system database and already has Service Broker enabled by default. You should change this to another database if you can guarantee it won't get dropped. So what to put in place for "interesting activation procedure code"? Hmm, so far I haven't addressed Matt's suggestion of writing a lengthy script to send an annoying message:

        SET @message = @x.value('(//HostName)[1]','sysname') + N' tried to log in to server ' + @x.value('(//ServerName)[1]','sysname')
            + N' as SA at ' + @x.value('(//StartTime)[1]','sysname') + N' using the ' + @x.value('(//ApplicationName)[1]','sysname')
            + N' program. That''s why you''re getting this message and the attached pornography which'
            + N' is bloating your inbox and violating company policy, among other things. If you know'
            + N' this person you can go to their desk and hit them, or use the following SQL to end their session: KILL '
            + @x.value('(//SPID)[1]','sysname')
            + N'; Hopefully they''re in the middle of a huge query that they need to finish right away.'
        EXEC msdb.dbo.sp_send_dbmail @recipients=N'[email protected]', @subject=N'SA Login Alert',
            @query_result_width=32767, @body=@message, @query=N'EXEC sp_readerrorlog;',
            @attach_query_result_as_file=1, @query_attachment_filename=N'UtterlyGrossPorn_SeriouslyDontOpenIt.jpg'

    I'm not sure I'd call that a lengthy script, but the attachment should get pretty big, and I'm sure the email admins will love storing multiple copies of it. The nice thing is that this also fires on Dedicated Admin connections! You can even identify DAC connections from the event data returned; I leave that as an exercise for you. You can use that info to change the action taken by the activation procedure, and since it's a stored procedure, it can pretty much do anything! Except KILL the SPID, or SHUTDOWN the server directly. I'm still working on those.

    Read the article

  • SQL SERVER – Installing Data Quality Services (DQS) on SQL Server 2012

    - by pinaldave
    Data Quality Services is a very interesting enhancement in SQL Server 2012. My friend and SQL Server expert Govind Kanshi has written an excellent article on this subject earlier on his blog. Yesterday I stumbled upon his blog one more time and decided to experiment with DQS myself. I have a basic understanding of DQS and MDS, so I knew I needed to start with the DQS Client. However, when I tried to find the DQS Client I was not able to find it under the SQL Server 2012 installation. I quickly realized that I needed to install the DQS client separately. You will find the DQS installer under the SQL Server 2012 >> Data Quality Services directory. The pre-requisites of DQS are Master Data Services (MDS) and IIS. If you have not installed IIS, you can follow the simple steps and install IIS on your machine. Once the pre-requisites are installed, click on the MDS installer once again and it will install DQS just fine. Be patient with the installer, as it can take a bit longer if your machine has a low-end configuration. Once the installation is over you will be able to expand the SQL Server 2012 >> Data Quality Services directory and you will notice that it has a new item called Data Quality Client. Click on it and it will open the client. Well, in a future blog post we will go over more details about DQS with detailed practical examples. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology Tagged: Data Quality Services

    Read the article

  • MSCC: Purpose and benefits of Version Control Systems (VCS)

    You're working in IT and not using any kind of version control system? Sorry, then you're doing something wrong! RSVP for the MSCC meetup of June. This month's meetup will be an introduction to the mechanics of version control systems (VCS) like git, Mercurial, TFS, and others in general. VCS are not optional but compulsory in any area of IT. Whether you're developing source code for the next buzz app, writing SQL scripts for your database, or automating your administrative tasks with shell scripts, it's better to have a "time machine" in order to keep multiple versions, stay organised and leverage the power of differences.
    git - a modern approach to VCS - Nayar: Nayar is going to give us a brief overview of the basic principles of working with git: what the necessary steps are to get started and which commands to use in order to get the most out of git.
    Visual Studio Online (VSO) - Jochen: Are you mainly rooted on the Windows platform and looking for a good alternative to Team Foundation Services (TFS)? Then VSO might give you a hand at achieving this. Similar to git, VSO is an open infrastructure but plays very well together with the Microsoft Azure cloud infrastructure.
    Recent and upcoming events in Mauritius: Let's have a chat about recent events like WebCup 2014 or the Emtel Knowledge Series and get a head start on upcoming events like Code Challenge, and others to come.
    Networking and general discussions: Of course, there will be plenty of time to chat and exchange with other like-minded craftsmen. Bring your topics and discuss various issues with other professionals. Share your experience and use the opportunity to learn from others. Looking forward to meeting you soon.

    Read the article

  • How do i make an AJAX block crawlable?

    - by Vikas Gulati
    I have a block with a few tabs. When the user clicks a tab, the content of that block gets loaded. Now I would like to make it crawlable by the search engines and at the same time I want to maintain a good user experience. I figured out a couple of alternatives, but each one has its own shortcomings. The approaches that I could come up with: Use hashbangs and then use this. But hashbangs are not good and a thing of the past now. Secondly, it will make my content crawlable only by Googlebot, as Yahoo and Bing don't support this. Use a GET-parameterized fallback in case JavaScript doesn't work. This will work for all bots and would also be nice as it would work without JavaScript. But then this will create duplicates of my page, as this block is only a very small section of my page and I have around 5-6 tabs. So it means that many duplicates! Doing this without AJAX is not an option as it would only increase the page load time, since all these blocks have heavy media content in them!
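    The question doesn't name a server-side stack, so purely as an illustration of the GET-parameter fallback idea, here is a hedged Java servlet sketch; the paths, the "tab" parameter and the canonical-link remark are all assumptions, not anything from the original site:

        // Sketch: the same URL serves either the full page (crawlers / no-JS visitors)
        // or just the tab fragment (the AJAX call), so nothing depends on hashbangs.
        import java.io.IOException;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class TabServlet extends HttpServlet {

            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                String tab = req.getParameter("tab") == null ? "overview" : req.getParameter("tab");
                boolean ajax = "XMLHttpRequest".equals(req.getHeader("X-Requested-With"));

                // A real implementation should whitelist the tab values before building paths.
                req.setAttribute("tab", tab);
                if (ajax) {
                    // Return only the fragment for the in-page tab switch.
                    req.getRequestDispatcher("/WEB-INF/fragments/" + tab + ".jsp").forward(req, resp);
                } else {
                    // Full page for crawlers and no-JS visitors; the JSP can emit a
                    // <link rel="canonical"> pointing at the main URL so the parameterized
                    // variants are not indexed as duplicate pages.
                    req.getRequestDispatcher("/WEB-INF/page.jsp").forward(req, resp);
                }
            }
        }

    On the client side the tab links can stay ordinary anchors pointing at the same parameterized URLs, with script intercepting the click to fetch just the fragment; crawlers and no-JS visitors simply follow the links and get the full page.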

    Read the article

  • ubuntu one not syncing

    - by Martin
    I am really starting to despair, as I have been trying Ubuntu One for several months, trying it on several machines, and it has caused me loads of different issues, wasting me a lot of time. It is not straightforward to use; it should be a piece of software that runs in the background, and users should not have to think about checking all the time whether it is really doing its job. Of course I have been searching around this website and other forums but couldn't find an answer to my situation. Yesterday I had several problems with the client not syncing and using a lot of the machine's RAM and CPU. I had to reboot on several occasions and leave the office's PC on overnight in order to sync a few files of not more than a few MB. Today I am experiencing another problem: I have decided to do a test, putting a small file in my Ubuntu One shared folder. Ubuntu One is not detecting it (now already more than an hour), therefore not uploading it to the server. martin@ubuntu-desktop:~$ u1sdtool --status State: QUEUE_MANAGER connection: With User With Network description: processing the commands pool is_connected: True is_error: False is_online: True queues: IDLE and martin@ubuntu-desktop:~$ u1sdtool --current-transfers Current uploads: 0 Current downloads: 0 I am running Ubuntu 11.04 64-bit with all recent updates. On my other machine the transfer of files seems to be completely frozen, with around 10 files in the queue but no transfer whatsoever. Another curious issue is on my Ubuntu 10.10 laptop, where Ubuntu One seems to have completely disappeared from the Nautilus context menu, with the folder/file sync status icons missing. I have therefore been forced to upgrade to 11.04 on this machine. Anyway, now I would like to solve the ** processing the commands pool ** issue and make sure the client

    Read the article

  • Elevating Customer Experience through Enterprise Social Networking

    - by john.brunswick
    I am not sure about most people, but I really dislike automated call center routing systems. They are impersonal and convey a sense that the company I am dealing with does not see the value of providing customer service that increases positive perception of their brand. By the time I am connected with a live support representative I am actually more frustrated than before I originally dialed. Each time a company interacts with its customers or prospects there is an opportunity to enhance that relationship. Technical enablers like call center routing systems can be a double edged sword - providing process efficiencies, but removing the human context of some interactions that can build a lot of long term value and create substantial repeat business. Certain web systems, available through "chat with a representative" now links on some web sites, provide a quick and easy way to get in touch with someone and cut down on help desk calls, but miss the opportunity to deliver an even more personal experience to customers and prospects. As more and more users head to the web for self-service and product information, the quality of this interaction becomes critical to supporting a company's brand image and viability. It takes very little effort to go a step further and elevate customer experience, without adding significant cost through social enterprise software technologies. Enterprise Social Networking Social networking technologies have slowly gained footholds in the enterprise, evolving from something that people may have been simply curious about, to tools that have started to provide tangible value in the enterprise. Much like instant messaging, once considered a toy in the enterprise, expertise search, blogs as communications tools, wikis for tacit knowledge sharing are all seeing adoption in a way that is directly applicable to the business and quickly adding value. So where does social networking come in when trying to enhance customer experience?

    Read the article

  • PO Communication in PDF

    - by Robert Story
    Upcoming Webcasts
    Date: March 29, 2010. Time: 2 pm London, 9:00 am EDT, 6:00 am PDT, 13:00 GMT. Click here to register for this session.
    Date: March 29, 2010. Time: 9 am London, 4:00 am EDT, 1:00 am PDT, 8:00 GMT. Click here to register for this session.
    Product Family: Procurement
    Summary: This one-hour session is recommended for technical and functional users who would like to know about the PO Communication functionality in procurement. Topics will include: Introduction to PO PDF communication - 11.5.10; Key Concepts; Prerequisites, Scope; Overview of PDF document generation; PDF solution overview; Technical Overview of PDF generation; Setup steps; Triggering Points of PDF generation; PO Output for communication - Concurrent program; Enter PO form: View Doc; iSupplier portal/Contracts preview; Enhancements; PDF Generation in Custom Layouts; Attachments in fax communication; R12 Communication Nontext Attachments through Email; Customizing templates; Advantages of PDF communication; Troubleshooting (Tips). A short, live demonstration (only if applicable) and question and answer period will be included.
    The above webcast is a service of the E-Business Suite Communities in My Oracle Support. For more information on other webcasts, please reference the Oracle Advisor Webcast Schedule. Click here to visit the E-Business Communities in My Oracle Support. Note that all links require access to My Oracle Support.

    Read the article

  • SQL SERVER – Copy Column Headers from Resultset – SQL in Sixty Seconds #027 – Video

    - by pinaldave
    SQL Server Management Studio returns results in Grid View, Text View, or to a file. When we copy results from Grid View to Excel, there is a common complaint that the column headers displayed in the resultset are not copied to Excel. I often spend time performance tuning databases and I run many DMVs in SSMS to get a quick view of the server. In my case it is almost certain that I always need the column headers when I copy my data to Excel or anywhere else. SQL Server Management Studio has two different ways to do this. Method 1: Ad hoc. When the result is rendered you can right-click on the resultset and click on Copy Header. This will copy the headers along with the resultset. Additionally, you can use the shortcut key CTRL+SHIFT+C for copying column headers along with the resultset. Method 2: Option setting at the SSMS level. This is an SSMS-level setting and I keep this option always selected, as I often need the column headers when I select the resultset. Go to Tools >> Options >> Query Results >> SQL Server >> Results to Grid >> check the box "Include column header when copying or saving the results." Both of the methods are discussed in the following SQL in Sixty Seconds video. Here is the code used in the video. Related Tips in SQL in Sixty Seconds: Copy Column Headers in Query Analyzers in Result Set; Getting Columns Headers without Result Data – SET FMTONLY ON. If we like your idea we promise to share with you educational material. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • The future for Microsoft

    - by Scott Dorman
    Originally posted on: http://geekswithblogs.net/sdorman/archive/2013/10/16/the-future-for-microsoft.aspxMicrosoft is in the process of reinventing itself. While some may argue that it’s “too little, too late” or that their growing consumer-focused strategy is wrong, the truth of the situation is that Microsoft is reinventing itself into a new company. While Microsoft is now calling themselves a “devices and services” company, that’s not entirely accurate. Let’s look at some facts: Microsoft will always (for the long-term foreseeable future) be financially split into the following divisions: Windows/Operating Systems, which for FY13 made up approximately 24% of overall revenue. Server and Tools, which for FY13 made up approximately 26% of overall revenue. Enterprise/Business Products, which for FY13 made up approximately 32% of overall revenue. Entertainment and Devices, which for FY13 made up approximately 13% of overall revenue. Online Services, which for FY13 made up approximately 4% of overall revenue. It is important to realize that hardware products like the Surface fall under the Windows/Operating Systems division while products like the Xbox 360 fall under the Entertainment and Devices division. (Presumably other hardware, such as mice, keyboards, and cameras, also fall under the Entertainment and Devices division.) It’s also unclear where Microsoft’s recent acquisition of Nokia’s handset division will fall, but let’s assume that it will be under Entertainment and Devices as well. Now, for the sake of argument, let’s assume a slightly different structure that I think is more in line with how Microsoft presents itself and how the general public sees it: Consumer Products and Devices, which would probably make up approximately 9% of overall revenue. Developer Tools, which would probably make up approximately 13% of overall revenue. Enterprise Products and Devices, which would probably make up approximately 47% of overall revenue. Entertainment, which would probably make up approximately 13% of overall revenue. Online Services, which would probably make up approximately 17% of overall revenue. (Just so we’re clear, in this structure hardware products like the Surface, a portion of Windows sales, and other hardware fall under the Consumer Products and Devices division. I’m assuming that more of the income for the Windows division is coming from enterprise/volume licenses so 15% of that income went to the Enterprise Products and Devices division. Most of the enterprise services, like Azure, fall under the Online Services division so half of the Server and Tools income went there as well.) No matter how you look at it, the bulk of Microsoft’s income still comes from not just the enterprise but also software sales, and this really shouldn’t surprise anyone. So, now that the stage is set…what’s the future for Microsoft? The future I see for Microsoft (again, this is just my prediction based on my own instinct, gut-feel and publicly available information) is this: Microsoft is becoming a consumer-focused enterprise company. Let’s look at it a different way. Microsoft is an enterprise-focused company trying to create a larger consumer presence.  To a large extent, this is the exact opposite of Apple, who is really a consumer-focused company trying to create a larger enterprise presence. The major reason consumer-focused companies (like Apple) have started making in-roads into the enterprise is the “bring your own device” phenomenon. 
Yes, Apple has created some “game-changing” products but their enterprise influence is still relatively small. Unfortunately (for this blog post at least), Apple provides revenue in terms of hardware products rather than business divisions, so it’s not possible to do a direct comparison. However, in the interest of transparency, from Apple’s Quarterly Report (filed 24 July 2013), their revenue breakdown is: iPhone, which for the 3 months ending 29 June 2013 made up approximately 51% of revenue. iPad, which for the 3 months ending 29 June 2013 made up approximately 18% of revenue. Mac, which for the 3 months ending 29 June 2013 made up approximately 14% of revenue. iPod, which for the 3 months ending 29 June 2013 made up approximately 2% of revenue. iTunes, Software, and Services, which for the 3 months ending 29 June 2013 made up approximately 11% of revenue. Accessories, which for the 3 months ending 29 July 2013 made up approximately 3% of revenue. From this, it’s pretty clear that Apple is a consumer-and-hardware-focused company. At this point, you may be asking yourself “Where is all of this going?” The answer to that lies in Microsoft’s shift in company focus. They are becoming more consumer focused, but what exactly does that mean? The biggest change (at least that’s been in the news lately) is the pending purchase of Nokia’s handset division. This, in combination with their Surface line of tablets and the Xbox, will put Microsoft squarely in the realm of a hardware-focused company in addition to being a software-focused company. That can (and most likely will) shift the revenue split to looking at revenue based on software sales (both consumer and enterprise) and also hardware sales (mostly on the consumer side). If we look at things strictly from a Windows perspective, Microsoft clearly has a lot of irons in the fire at the moment. Discounting the various product SKUs available and painting the picture with broader strokes, there are currently 5 different Windows-based operating systems: Windows Phone Windows Phone 7.x, which runs on top of the Windows CE kernel Windows Phone 8.x+, which runs on top of the Windows 8 kernel Windows RT The ARM-based version of Windows 8, which runs on top of the Windows 8 kernel Windows (Pro) The Intel-based version of Windows 8, which runs on top of the Windows 8 kernel Xbox The Xbox 360, which runs it’s own proprietary OS. The Xbox One, which runs it’s own proprietary OS, a version of Windows running on top of the Windows 8 kernel and a proprietary “manager” OS which manages the other two. Over time, Windows Phone 7.x devices will fade so that really leaves 4 different versions. Looking at Windows RT and Windows Phone 8.x paints an interesting story. Right now, all mobile phone devices run on some sort of ARM chip and that doesn’t look like it will change any time soon. That means Microsoft has two different Windows based operating systems for the ARM platform. Long term, it doesn’t make sense for Microsoft to continue supporting that arrangement. I have long suspected (since the Surface was first announced) that Microsoft will unify these two variants of Windows and recent speculation from some of the leading Microsoft watchers lends credence to this suspicion. It is rumored that upcoming Windows Phone releases will include support for larger screen sizes, relax the requirement to have a hardware-based back button and will continue to improve API parity between Windows Phone and Windows RT. 
At the same time, Windows RT will include support for smaller screen sizes. Since both of these operating systems are based on the same core Windows kernel, it makes sense (both from a financial and development resource perspective) for Microsoft to unify them. The user interfaces are already very similar. So similar in fact, that visually it’s difficult to tell them apart. To illustrate this, here are two screen captures: Other than a few variations (the Bing News app, the picture shown in the Pictures tile and the spacing between the tiles) these are identical. The one on the left is from my Windows 8.1 laptop (which looks the same as on my Surface RT) and the one on the right is from my Windows Phone 8 Lumia 925. This pretty clearly shows that from a consumer perspective, there really is no practical difference between how these two operating systems look and how you interact with them. For the consumer, your entertainment device (Xbox One), phone (Windows Phone) and mobile computing device (Surface [or some other vendors tablet], laptop, netbook or ultrabook) and your desktop computing device (desktop) will all look and feel the same. While many people will denounce this consistency of user experience, I think this will be a good thing in the long term, especially for the upcoming generations. For example, my 5-year old son knows how to use my tablet, phone and Xbox because they all feature nearly identical user experiences. When Windows 8 was released, Microsoft allowed a Windows Store app to be purchased once and installed on as many as 5 devices. With Windows 8.1, this limit has been increased to over 50. Why is that important? If you consider that your phone, computing devices, and entertainment device will be running the same operating system (with minor differences related to physical hardware chipset), that means that I could potentially purchase my sons favorite Angry Birds game once and be able to install it on all of the devices I own. (And for those of you wondering, it’s only 7 [at the moment].) From an app developer perspective, the story becomes even more compelling. Right now there are differences between the different operating systems, but those differences are shrinking. The user interface technology for both is XAML but there are different controls available and different user experience concepts. Some of the APIs available are the same while some are not. You can’t develop a Windows Phone app that can also run on Windows (either Windows Pro or RT). With each release of Windows Phone and Windows RT, those difference become smaller and smaller. Add to this mix the Xbox One, which will also feature a Windows-based operating system and the same “modern” (tile-based) user interface and the visible distinctions between the operating systems will become even smaller. Unifying the operating systems means one set of APIs and one code base to maintain for an app that can run on multiple devices. One code base means it’s easier to add features and fix bugs and that those changes become available on all devices at the same time. It also means a single app store, which will increase the discoverability and reach of your app and consolidate revenue and app profile management. Now, the choice of what devices an app is available on becomes a simple checkbox decision rather than a technical limitation. Ultimately, this means more apps available to consumers, which is always good for the app ecosystem. Is all of this just rumor, speculation and conjecture? 
Of course, but it’s not unfounded. As I mentioned earlier, some of the prominent Microsoft watchers are also reporting similar rumors. However, Microsoft itself has even hinted at this future with their recent organizational changes and by telling developers “if you want to develop for Xbox One, start developing for Windows 8 now.” I think this pretty clearly paints the following picture: Microsoft is committed to the “modern” user interface paradigm. Microsoft is changing their release cadence (for all products, not just operating systems) to be faster and more modular. Microsoft is going to continue to unify their OS platforms both from a consumer perspective and a developer perspective. While this direction will certainly concern some people it will excite many others. Microsoft’s biggest failing has always been following through with a strong and sustained marketing strategy that presents a consistent view point and highlights what this unified and connected experience looks like and how it benefits consumers and enterprises. We’ve started to see some of this over the last few years, but it needs to continue and become more aggressive and consistent. In the long run, I think Microsoft will be able to pull all of these technologies and devices together into one seamless ecosystem. It isn’t going to happen overnight, but my prediction is that we will be there by the end of 2016. As both a consumer and a developer, I, for one, am excited about the future of Microsoft.

    Read the article

  • Oracle Enterprise Pack for Eclipse (OEPE) 11.1.1.7 adds Oracle ADF Tooling Support

    - by greg.stachnick
    Oracle Enterprise Pack for Eclipse (OEPE) 11.1.1.7 is now available and includes first-time support for Oracle ADF development in Eclipse. Installers for OEPE 11.1.1.7 as well as Eclipse Update instructions can be found on the OEPE downloads page. Here is an overview of the new features of OEPE 11.1.1.7: Support for Oracle ADF Faces. Oracle Enterprise Pack for Eclipse (OEPE) 11.1.1.7 now provides support for development with Oracle ADF 11.1.1.4. These features focus on enablement and configuration of the ADF Runtime with Eclipse and WebLogic Server 10.3.4 as well as design-time tools for ADF Faces. A new OEPE 11.1.1.7 installer bundles WebLogic Server 10.3.4, Coherence 3.6, and Oracle ADF 11.1.1.4. New Server Extensions allow you to download and install the ADF Runtime libraries into an existing WebLogic Server from within Eclipse. New Project Templates and Facets are available for ADF Faces development (ADF Web). New ADF validators with QuickFix options will check common descriptors for the appropriate ADF configurations. ADF-enabled JSP templates supporting multiple layouts are available under the New menu. New Remote and Local run/deploy support for ADF applications to WebLogic Server 10.3.4. The Palette now supports drag and drop of ADF Faces and Data Visualization Tools (DVT) tags and includes editors for each tag configuration. The Eclipse Property Sheet has been enhanced to provide advanced ADF tag configuration. The AppXRay dependency engine provides improved validation, code completion, and hyperlink navigation for ADF Faces and DVT tags. The Eclipse Web Page Editor enables a more productive source editing experience for ADF Faces. UI Consolidation for WebLogic Server Tools. Oracle Enterprise Pack for Eclipse 11.1.1.7 includes a more streamlined UI for WebLogic Server development. You can now view deployments within the Servers view to understand which modules have been deployed to the domain. The MBean Browser View has been merged with the Servers view, enabling easier access to MBean values while still allowing drag and drop to WLST scripts. WebLogic Server configuration options have been moved to the Properties window; right-click a server configuration and select Properties.

    Read the article

  • Community Events and Workshops in November 2012 #ssas #tabular #powerpivot

    - by Marco Russo (SQLBI)
    Alberto and I have a busy agenda until the end of the month, but if you are based in Northern Europe there are many chances to meet one of us in the next couple of weeks!
    Belgium, 20 November 2012 – SQL Server Days 2012 with Marco Russo: I will present two sessions at this conference, "Data Modeling for Tabular" and "Querying and Optimizing DAX".
    Copenhagen, 21-22 November 2012 – SSAS Tabular Workshop with Alberto Ferrari: Alberto will be the speaker for 2 days – you can still register if you want a full immersion!
    Copenhagen, 21 November 2012 – Free Community Event with Alberto Ferrari (hosted in Microsoft Hellerup): In the evening Alberto will present "Excel 2013 PowerPivot in Action".
    Munich, 27-28 November 2012 - SSAS Tabular Workshop with Alberto Ferrari: The SSAS workshop will also run in Germany, this time in Munich. Here too there are still some seats available.
    Munich, 27 November 2012 - Free Community Event with Alberto Ferrari (hosted in Microsoft): In the evening Alberto will present "Excel 2013 PowerPivot in Action".
    Moscow, 27-28 November 2012 – TechEd Russia 2012 with Marco Russo: I will speak during the keynote on November 27 and I will present two sessions the day after, "Developing an Analysis Services Tabular Project BI Semantic Model" and "Excel 2013 PowerPivot in Action".
    Stockholm, 29-30 November 2012 - SSAS Tabular Workshop with Marco Russo: I will run this workshop in Stockholm – if you want to register here, hurry up! A few seats are still available!
    Stockholm, 29 November 2012 - Free Community Event (sold out!) with Marco Russo: In the evening I will present "Excel 2013 PowerPivot in Action".
    If you want to attend an SSAS Tabular Workshop online, you can also register for the Online edition of December 5-6, 2012, which is still in early bird and is scheduled in a friendly time zone for the Americas (which could be good for Europe too, in case you don't mind attending a workshop until midnight!).

    Read the article

  • How can I transition from being a "9-5er" to being self-employed?

    - by Stephen Furlani
    Hey, I posted this question last fall about moonlighting, and I feel like I've got a strong case to make to start transitioning from being a Full-Time Employee to being self-employed. So much so, that I find it hard to concentrate at work on the things I'm supposed to be doing. However, self-employment comes with things like no health benefits or guaranteed income... so I don't feel like I can just quit. (At least not in this economy with a house and family). I'm already working 40hrs/wk on my main job, going to school to get my MS, and trying to freelance on weekends and evenings, but I want to give it more time. If I can't take LWOP or just work less than 40hrs/wk I feel like I have to give up self-employment because I just can't give my day job all my best. Would it be reasonable to ask my employer if they can cut my hours (and pay)? Is there something else I can/should do? Has anyone done this transition and had it turn out well? or bad? I am in the USA and I understand answers are not legal advice. Thanks!

    Read the article

  • Where's my MD.070?

    - by Dave Burke
    In a previous Blog entry titled “Where’s My MD.050” I discussed how the OUM Analysis Specification is the “new-and-improved” version of the more traditional Functional Design Document (or MD.050 for Oracle AIM stalwarts). In a similar way, the OUM Design Specification is an evolution of what we used to call the Technical Design Document (or MD.070). Let’s dig a little deeper…… In a traditional software development process, the “Design Task” would include all the time and resources required to design the software component(s), AND to create the final Technical Design Document. However, in OUM, we have created distinct Tasks for pure design work, along with an optional Task for pulling all of that work together into a Design Specification. Some of the Design Tasks shown above will result in their own Work Products (i.e. an Architecture Description), whilst other Tasks would act as “placeholders” for a specific work effort. In any event, the DS.140 Design Specification can include a combination of unique content, along with links to other Work Products, together which enable a complete technical description of the component, or solution, being designed. So next time someone asks “where’s my MD.070” the short answer would be to tell them to read the OUM Task description for DS.140 – Design Specification!

    Read the article

  • Some SharePoint NDA Information

    - by Sahil Malik
    Many years ago, at the last-to-last-to-last MVP summit, Microsoft was kind enough to share with us what they were thinking wayyyyyyyyyyyy ahead! I especially remember John Durant talking about the specific enhancements planned for the SharePoint 2010 development experience. If you haven't seen John Durant talking on stage, the guy has more enthusiasm than Tiger Woods in Amsterdam! The energy of his presentations is simply amazing. So, I pulled out my phone, and I snapped a picture! And I emailed that picture to everyone in MVP land and Microsoft land, saying "We have evidence", i.e. here are the promises that were made, and dammit we'll see by the time you release SP2010 how many of these you actually deliver. Here is the picture, ladies and gentlemen - it's a good karate chop action shot, isn't it? Of course, we were all immediately warned not to share any of this seriously strictly NDA information at the time. Well, now that the information is out in the world, I can finally share this small tidbit of how far ahead Microsoft is thinking in their plans. Frankly, I wouldn't be surprised if today they have a very clear idea of what SharePoint vNext will be all about, or should I say vNextvNext? Have fun!

    Read the article

< Previous Page | 570 571 572 573 574 575 576 577 578 579 580 581  | Next Page >