Search Results

Search found 25550 results on 1022 pages for 'umbraco development'.


  • 1.5 million Windows Phone 7 devices sold…

    - by Boonei
    Microsoft announced that it has sold over 1.5 million Windows Phone 7 devices. Windows Phone 7 is a new generation of OS, and mobile operators, users and device programmers all need to adapt to it. It is not going to be an easy transition, because it is not simply the next version of Windows Mobile 6.x. We have heard that Microsoft will wind down development for Win 6.x devices at some point, though we don’t know how long support will last. Everything in it is quite new, like the OS, the user interface and Xbox sync, and it also requires phone makers to run the OS on high-end chips, meaning at least 1 GHz. So the user segment occupied by phones like the HTC Wildfire is not the one being targeted. There is a catch with this magic number of 1.5 million, though: it counts only the units sold to mobile operators and retailers, not the actual units in consumers’ hands and activated. The number could improve significantly in 2011, when Sprint and Verizon join the party in the United States. At least a dozen phone models running Windows Phone 7 are now lined up in the rest of the world. One thing customers can rejoice about is that Microsoft will push software updates directly to all its consumers; operators will not interfere. That alone, a point where Google’s Android lags behind, should support strong sales going forward. [Img Credit: Microsoft] This article, titled "1.5 million Windows Phone 7 devices sold…", was originally published at Tech Dreams.

    Read the article

  • JavaScript local alias pattern

    - by Bertrand Le Roy
    Here’s a little pattern that is fairly common among JavaScript developers but that is not very well known to C# developers or people doing only occasional JavaScript development. In C#, you can use a “using” directive to create aliases of namespaces or bring them into the global scope:

        namespace Fluent.IO {
            using System;
            using System.Collections;
            using SystemIO = System.IO;
            // ...
        }

    In JavaScript, the only scoping construct is the function, but it can also be used as a local aliasing device, just like the above using directive:

        (function($, dv) {
            $("#foo").doSomething();
            var a = new dv("#bar");
        })(jQuery, Sys.UI.DataView);

    This piece of code makes the jQuery object accessible under the $ alias throughout the code that lives inside the function, without polluting the global scope with another variable. The benefit is even bigger for the dv alias, which stands here for Sys.UI.DataView: think of the reduction in file size if you use that one a lot, or about how much less you’ll have to type… I’ve gotten into the habit of putting almost all of my code, even page-specific code, inside one of those closures, not just because it keeps the global scope clean but mostly because of that handy aliasing capability.
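
    As a quick illustration of that habit, here is a minimal sketch of what one of those page-specific closures might look like; the element ids and the refresh handler are made up for the example, and it assumes jQuery and the Microsoft Ajax library are already loaded on the page:

        (function ($, dv) {
            // Local aliases: $ is jQuery, dv is Sys.UI.DataView.
            var refresh = function () {
                $("#status").text("Refreshing...");
                var view = new dv("#items"); // hypothetical element id
            };
            // Wire up page-specific behavior once the DOM is ready;
            // nothing declared here leaks into the global scope.
            $(function () {
                $("#refresh").click(refresh);
            });
        })(jQuery, Sys.UI.DataView);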

    Read the article

  • SSIS Prehistory video

    - by jamiet
    I’m currently spending my Easter bank holiday putting together my presentation SSIS Dataflow Performance Tuning for the upcoming SQL Bits conference in London, and in doing so I’m researching some old material about how the dataflow actually works. Boring as that is, I’ve gotten easily sidetracked and have chanced upon an old video on Channel 9 entitled Euan Garden - Tour of SQL Server Team (part I). Euan is a former member of the SQL Server team, and in this series of videos he walks the halls of the SQL Server building on Microsoft’s Redmond campus talking to some of the various protagonists; in this one he happens upon the SQL Server Integration Services team. The video was shot in 2004, so this is a fascinating (to me anyway) glimpse into the development of SSIS from before it was ever shipped, and if you’re a geek like me you’ll really enjoy this behind-the-scenes look into how and why the product was architected. The video is also notable for the presence of the cameraman – none other than the now-rather-more-famous-than-he-was-then Robert Scoble. See it at http://channel9.msdn.com/posts/TheChannel9Team/Euan-Garden-Tour-of-SQL-Server-Team-part-I/ Enjoy! @Jamiet

    Read the article

  • Good Books About Scaling Up Databases/Servers/etc.?

    - by Mehrdad
    I've applied for an internship at a startup company that expects its user base to grow by a large factor in a small amount of time, so part of their project is to scale everything up so that they're ready: handling more and larger requests efficiently, handling server failures, load balancing, getting JavaScript to run faster on the client computers, etc. Part of my job will also be figuring out what to do, so it's not obvious what my exact task will be at the moment. I was told that I should start reading up a little more about this so that I would have a little bit of an idea of what to do. What are some good books for me to read on this topic? I have a little bit of experience with the usage of MySQL (and also a little experience with web development), but in no way do I claim any knowledge of the internal workings of databases or distributed systems, so I might need readings more on the introductory side.

    Read the article

  • BUILD 2013 – Day 2 Summary

    - by Tim Murphy
    Originally posted on: http://geekswithblogs.net/tmurphy/archive/2013/06/28/build-2013ndashday-2-summary.aspx Day 1 rocked. So how could they top that? By having more goodies to give away! During the keynote they announced that attendees would get one year of Office 365, 100 GB of SkyDrive and one year of Adobe Cloud Service. Overall the keynote was long, with more information shot at you than you could possibly absorb. They went about 20 minutes over time, which made me think that they could have split it into a third keynote and given us a better idea on some of these topics, and perhaps addressed the one open question that was floating around Twitter: what is going to happen with Xbox development? It sounded like there was a quick side mention of that, but I missed it. The rest of the day was packed with great sessions full of Windows 8, Azure and Windows Phone goodness. I had planned on attending Scott Hanselman’s talk, but they had so many people that they had to push it to an overflow room. Stay tuned for session summaries later. The day was topped off by an attendee party across from the San Francisco Giants’ ball park. It was kind of quirky and fun. They set it up on one of the piers in the bay and had food served by food trucks. You would be surprised how good the food was. Add in some pool tables, foosball, video games, a DJ, a comedian/musician and plenty of spirits, and it was a great way to end day 2.

    Read the article

  • VS2012 - How to manually convert .NET Class Library to a Portable Class Library

    - by Igor Milovanovic
    The portable libraries are the response to the growing profile fragmentation in .NET frameworks. With the help of portable libraries you can share code between different runtimes without dreadful #ifdef PLATFORM statements or, even worse, “Add as Link” source-file-sharing practices. If you have an existing .NET class library which you would like to reference from a different runtime (e.g. you have a .NET Framework 4.5 library which you would like to reference from a Windows Store project), you can either create a new portable class library and move the classes there, or edit the existing .csproj file and change the XML directly. The following example shows how to convert a .NET Framework 4.5 library to a Portable Class Library. First, unload the project and change the following setting in the .csproj file:

        <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />

    to:

        <Import Project="$(MSBuildExtensionsPath32)\Microsoft\Portable\$(TargetFrameworkVersion)\Microsoft.Portable.CSharp.targets" />

    and add the following keys to the first property group in order to get Visual Studio to show the framework picker dialog:

        <ProjectTypeGuids>{786C830F-07A1-408B-BD7F-6EE04809D6DB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>

    After that you can select the frameworks in the Library tab of the Portable Library. As a last step, delete any framework references from the library, as they are already referenced via the .NET Portable Subset. [1] Cross-Platform Development with the .NET Framework - http://msdn.microsoft.com/en-us/library/gg597391.aspx [2] Framework Profiles in .NET: http://nitoprograms.blogspot.de/2012/05/framework-profiles-in-net.html
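
    Pulled together, the edited .csproj might look roughly like the sketch below. This is an illustration only, assuming a plain C# class library; unrelated elements are elided with comments:

        <Project ToolsVersion="4.0" DefaultTargets="Build"
                 xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
          <PropertyGroup>
            <!-- These GUIDs mark the project as portable, enabling the framework picker -->
            <ProjectTypeGuids>{786C830F-07A1-408B-BD7F-6EE04809D6DB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
            <TargetFrameworkVersion>v4.5</TargetFrameworkVersion>
            <!-- remaining properties unchanged -->
          </PropertyGroup>
          <!-- item groups (Compile, etc.) unchanged -->
          <!-- the portable targets replace the plain C# targets import -->
          <Import Project="$(MSBuildExtensionsPath32)\Microsoft\Portable\$(TargetFrameworkVersion)\Microsoft.Portable.CSharp.targets" />
        </Project>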

    Read the article

  • SQLAuthority News – Download SQL Server 2008 R2 Upgrade Technical Reference Guide

    - by pinaldave
    I recently came across a very interesting white paper written for Microsoft by Solid Quality Mentors. A successful upgrade to SQL Server 2008 R2 should be smooth and trouble-free. To get that smooth transition, you must plan sufficiently for the upgrade and match the plan to the complexity of your database application. Otherwise, you risk costly and stressful errors and upgrade problems. The SQL Server 2008 R2 Upgrade Technical Reference Guide is one of the best and most comprehensive reference guides I have seen on the subject of upgrading to SQL Server 2008 R2. It discusses the many aspects of an upgrade that one would want to see covered. You can find the reasons for upgrading here: Why upgrade to SQL Server 2008 R2. The white paper itself: SQL Server 2008 R2 Upgrade Guide. Here is the quick list of the white paper's contents:

        1. Upgrade Planning and Deployment
        2. Management and Development Tools
        3. Relational Databases
        4. High Availability
        5. Database Security
        6. Full-Text Search
        7. Service Broker
        8. Transact-SQL Queries
        9. Notification Services
        10. SQL Server Express
        11. Analysis Services
        12. Data Mining
        13. Integration Services
        14. Reporting Services
        15. Other Microsoft Applications and Platforms
        Appendix 1: Version and Edition Upgrade Paths
        Appendix 2: Upgrade Planning Deployment and Tasks Checklist

    This white paper is indeed huge, with 490 pages and 151,956 words. As I said, this is one of the most comprehensive white papers ever published on the subject. Just by reading this white paper one can learn a lot about SQL Server. Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Read the article

  • Managing Social Relationships for the Enterprise – Part 1

    - by kellsey.ruppel
    By Reggie Bradford, Senior Vice President, Oracle. Today, Mark Hurd, President of Oracle, Thomas Kurian, Executive Vice President of Oracle, and I discussed the strategic importance of how social media is impacting the enterprise and how it is changing the way customers, prospects, employees and investors interact with brands worldwide. Oracle understands that the consumer is in control, and as such, brands must evolve and change to meet growing needs. In addition, according to social media thought leader and Altimeter Group analyst Jeremiah Owyang, companies now average 178 corporate-owned social media accounts. When Oracle added leading social marketing, listening analytics and development tools from Vitrue, Collective Intellect and Involver to its Oracle Cloud Services Suite, we went beyond providing a single set of tools. We developed an entire framework, including a comprehensive social relationship management suite, to help companies move beyond the social enterprise and achieve the social-enabled enterprise. The fundamental shift from transaction to engagement means that enterprises need not only a social strategy, but should also ensure that the information and data received from social initiatives flow back to marketing, sales, support and service. Doing so enables companies to deliver a proactive and compelling experience and provides analytics to turn engagement into opportunity – and ultimately that opportunity into revenue. On September 13, 2012, I am delighted to sit down with Jeremiah to further the discussion about how enterprises are addressing social media strategies and managing content. In addition, we will be taking your questions after the webinar via Twitter (@Oracle, @ReggieBradford, @cfinn, @jowyang). Use #oracle and #socbiz to submit questions and follow the conversation. I look forward to speaking with you and answering your questions online. For more information about becoming a social-enabled enterprise, visit www.oracle.com/social. And don’t miss the insights of other social business thought leaders at www.oracle.com/goto/socialbusiness.

    Read the article

  • How to backup/restore excluding filestream varbinary in SQL Server 2008?

    - by fdierre
    There is an application used at a production site that uses SQL Server 2008 as its DBMS. The database schema uses filestream varbinary columns to save binary data on the filesystem instead of directly in the DB tables. The point is that now and then it would be useful to copy the production database onto development machines, mostly for troubleshooting. The database is too big to move around comfortably, but it would be OK if it could be moved leaving out the filestream varbinary fields. In other words, I am trying to make an "imperfect" copy of a database: i.e., on the destination database, it is OK to have NULL values instead of the varbinary. Is this possible? I tried looking for the feature in SQL Server Management Studio and did a backup that excludes the filegroup containing the filestream varbinary, but I cannot restore: Management Studio complains that the restore cannot be done because the backup is incomplete (of course). Is it possible to achieve what I am trying to do in some way?
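
    For readers hitting the same wall: the usual route for this kind of partial copy is a filegroup backup restored piecemeal with the PARTIAL option, which tells SQL Server the missing filegroups are expected rather than an error. A rough sketch with hypothetical database, file and path names follows; note that on the copy, queries touching the filestream columns will fail rather than return NULL, so this is close to, but not exactly, the behaviour asked for:

        -- Back up only the primary filegroup, leaving the
        -- filestream filegroup out of the backup set.
        BACKUP DATABASE ProdDb
            FILEGROUP = 'PRIMARY'
            TO DISK = 'C:\Backups\ProdDb_primary.bak';

        -- On the development machine, restore piecemeal; PARTIAL marks
        -- the absent filestream filegroup as deliberately unrestored.
        RESTORE DATABASE ProdDb
            FILEGROUP = 'PRIMARY'
            FROM DISK = 'C:\Backups\ProdDb_primary.bak'
            WITH PARTIAL, RECOVERY,
                 MOVE 'ProdDb_data' TO 'D:\Data\ProdDb.mdf',
                 MOVE 'ProdDb_log'  TO 'D:\Data\ProdDb.ldf';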

    Read the article

  • VISIT ORACLE LINUX PAVILION @ORACLE OPENWORLD

    - by Zeynep Koch
    Back by popular demand, Oracle will again host the Oracle Linux Pavilion at Oracle OpenWorld from October 1-3. The pavilion will be located in the Exhibition Hall at Moscone South, Booth 1033, next to the Oracle DEMOgrounds and Oracle Linux demopods. At the pavilion a select group of ISVs, IHVs, and SIs will showcase their products that have been Oracle Linux- and/or Oracle VM-certified. These certified products enable customer applications to run faster, thereby saving money. Partners exhibiting their solutions in the Oracle Linux Pavilion include:

        BeyondTrust: context-aware security intelligence for dynamic IT infrastructures such as cloud, mobile, and virtual technologies
        Centrify: control, secure, and audit access to cross-platform systems, mobile devices, and applications
        Data Intensity: cloud services and application management
        Fujitsu: technology platforms, private cloud, services, ubiquitous and device solutions
        HP: converged cloud, converged infrastructure, application transformation, and information optimization
        LSI: intelligent solid-state storage solutions for breakthrough database acceleration
        Mellanox: InfiniBand and Ethernet end-to-end server and storage interconnect solutions and services for data centers
        Micro Focus: mainframe solutions, application modernization and development tools, software quality tools
        NetApp: storage and data management
        QLogic: high performance networking
        Teleran: BI and data warehouse management solutions for Oracle Exadata Database Machine and Oracle Database

    Be sure to pick up your free Oracle Linux and Oracle VM DVD Kit if you visit one of these partners. And speaking of free, be sure to stop by for some cool treats, courtesy of sponsor QLogic:

        Smoothie Bar on Monday, October 1 from 2:30 p.m. - 5:30 p.m.
        Ice Cream Social on Wednesday, October 3 from 1:00 p.m. - 2:00 p.m.

    We look forward to seeing you at the pavilion.

    Read the article

  • Promoting Organizational Visibility for SOA and SOA Governance Initiatives – Part I by Manuel Rosa and André Sampaio

    - by JuergenKress
    The costs of technology assets can become significant, and the need to centralize, monitor and control the contribution of each technology asset becomes a paramount responsibility for many organizations. Through the implementation of various mechanisms, it is possible to obtain a holistic vision and develop synergies between different assets, empowering their re-utilization and analyzing the impact on the organization caused by IT changes. When the SOA domain is considered, the issue of governance should therefore always come into play. Although SOA governance is mandatory to achieve any measure of SOA success, its value still passes incognito in most organizations, mostly due to the lack of visibility and the detached view of the SOA initiatives. There are a number of problems that jeopardize the visibility of these initiatives:

        Understanding and measuring the value of SOA governance and its contribution – SOA governance tools are too technical and isolated from other systems. They are inadequate for anyone outside of the domain (business analysts, project managers, or even some enterprise architects), and are especially harsh at the CxO level.
        Lack of information exchange with the business, other operational areas and project management – It is not only a matter of lack of dialog but also the question of using a common vocabulary (textual or graphic) that is adequate for all the stakeholders. We need to generate information that can be useful for a wider scope of stakeholders, like business and enterprise architectures.

    In this article we describe how an organization can leverage existing best practices and, with the help of adequate exploration and communication tools, achieve and maintain the level of quality and visibility that is required for SOA and SOA governance initiatives. Introduction: Understanding and implementing effective SOA governance has become a corporate imperative in order to ensure coherence and the attainment of the basic objectives of SOA initiatives:

        develop the correct services
        control costs and risks bound to the development process
        reduce time-to-market

    Read the full article here. SOA & BPM Partner Community: For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • CentOS 5.8 Server, manual install PHP 5.2.17

    - by Shiro
    I would like to manually install PHP 5.2.17. I managed to install httpd and mysql, but for PHP 5.2.17 I could not find a proper guide. These are the steps I have done on a fresh installation of CentOS 5.8 x86_64 (server & server GUI):

        yum install httpd httpd-devel
        /etc/init.d start (OK)
        /etc/init.d stop (OK)
        yum install mysql mysql-server mysql-devel
        yum remove php
        yum groupinstall "Development Tools"
        yum install libxml2-devel
        wget http://www.php.net/release/php5.2.17.zip   # client requirement: must use this version
        cd php5.2.17
        ./configure --with-apxs2=/usr/sbin/apxs --with-mysql=/usr/local

    This is the area where I am confused: I could not find /usr/sbin/apxs on my system. I did another Google search on how to manually install PHP, and they suggested using ./configure --with-apxs2=/usr/local/apache2/bin/apxs --with-mysql. In both locations I cannot find apxs or apache2. I am afraid I have made a mistake somewhere. Please help and guide me on this. I am a newbie on CentOS.
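
    For anyone chasing the same missing file: on CentOS, apxs is shipped by the httpd-devel package and normally lands at /usr/sbin/apxs, so a first step is simply confirming the package installed and where it put the tool. A sketch of the usual checks and build, assuming a yum-installed MySQL (the paths shown are typical defaults, not guaranteed):

        # Confirm httpd-devel actually installed and locate apxs
        rpm -ql httpd-devel | grep apxs   # typically /usr/sbin/apxs
        which apxs

        # Point configure at the real apxs path; --with-mysql expects the
        # MySQL installation prefix, which is /usr for yum-installed MySQL
        ./configure --with-apxs2=/usr/sbin/apxs --with-mysql=/usr
        make
        make install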

    Read the article

  • Worldwide Web Camps

    - by ScottGu
    Over the next few weeks Microsoft is sponsoring a number of free Web Camp events around the world. These provide a great way to learn about ASP.NET 4, ASP.NET MVC 2, and Visual Studio 2010. The Web Camps are two-day events. The camps aren’t conferences where you sit quietly for hours and people talk at you – they are intended to be interactive. The first day is focused on learning through presentations that are heavy on coding demos. The second day is focused on you building real applications using what you’ve learned. The second day includes hands-on labs, and you’ll join small development teams with other attendees and work on a project together. We’ve got some great speakers lined up for the events – including Scott Hanselman, James Senior, Jon Galloway, Rachel Appel, Dan Wahlin, Christian Wenz and more. I’ll also be presenting at one of the camps. Below is the schedule of the remaining events (the sold-out Toronto camp was a few days ago):

        Moscow: May 19-19
        Beijing: May 21-22
        Shanghai: May 24-25
        Mountain View: May 27-28
        Sydney: May 28-29
        Singapore: June 04-05
        London: June 04-05
        Munich: June 07-08
        Chicago: June 11-12
        Redmond, WA: June 18-19
        New York: June 25-26

    Many locations are sold out already, but we still have some seats left in a few of them. Registration and attendance at all of the events is completely free. You can register to attend at www.webcamps.ms. Hope this helps, Scott

    Read the article

  • Fn + Right Arrow on MacBook Pro does not work as END Key for SuperUser web site

    - by George2
    Hello everyone, I am using a MacBook Pro running Mac OS X 10.5. I am new to this development environment, having previously worked on Windows. I have found that Fn + Right Arrow on the Mac works like the END key on Windows (it moves the cursor to the end of the line) for most applications, like Word for Mac; I have tried it and it works. But Fn + Right Arrow does not work in the question input box of the SuperUser web site. I am wondering how to solve this issue, and why it happens. Does anyone else have the same issue? Thanks in advance, George

    Read the article

  • Testing home directory scripts by setting $HOME to the location of the test directory

    - by intuited
    I have an interdependent collection of scripts in my ~/bin directory, as well as a developed ~/.vim directory and some other libraries and such in other subdirectories. I've been versioning all of this using git, and have realized that it would be potentially very easy and useful to do development and testing of new and existing scripts, vim plugins, etc. using a cloned repo, and then pull the working code into my actual home directory with a merge. The easiest way to do this would seem to be to just change & export $HOME, e.g.:

        cd ~/testing; git clone ~ home
        export HOME=~/testing/home
        cd ~
        screen -S testing-home
        # start vim, write/revise plugins, edit scripts, etc.
        # test revisions

    However, since I've never tried this before, I'm concerned that some programs, environment variables, etc. may end up using my actual home directory instead of the exported one. Is this a viable strategy? Are there just a few outliers that I should be careful about? Is there a much better way to do this sort of thing?
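
    A narrower variation on the same idea, for what it's worth, is to scope the override to a single process rather than exporting it into the whole shell, so a stray terminal elsewhere keeps the real $HOME. A small sketch, using the question's own paths:

        # Login shell whose $HOME is the test clone; only this process
        # tree (and what you start from it) sees the override.
        HOME=~/testing/home bash -l

        # Or scope it to one command, e.g. vim picking up the test ~/.vim
        HOME=~/testing/home vim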

    Read the article

  • How can I display and log PHP errors on IIS7?

    - by Ben
    We're running PHP 5.2.5 on an IIS 7 server and we're having problems making PHP errors visible... At the moment, whenever we have a PHP error the server sends back a 500 error with the message "The page cannot be displayed because an internal server error has occurred." This might be a good setting for production websites, but it's rather annoying on a development server... ;-) I have tried configuring php.ini to display errors on the screen as well as log them to a specific folder, but it seems that the server catches all errors first and prevents any handling of them by PHP... Does someone know what we have to do to make IIS display PHP errors on screen? Any links, tips or tutorials on the subject would be appreciated!
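
    In case it helps someone landing on this page: there are usually two halves to this problem. PHP has to be told to report errors, and IIS 7 has to be told to pass PHP's response through instead of substituting its own friendly 500 page. A sketch of both pieces, with an assumed log path:

        ; php.ini (development settings)
        display_errors = On
        log_errors = On
        error_log = "C:\logs\php-errors.log"  ; assumed path; must be writable by the PHP process

        <!-- web.config: keep IIS 7 from replacing PHP's error output -->
        <configuration>
          <system.webServer>
            <httpErrors existingResponse="PassThrough" />
          </system.webServer>
        </configuration>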

    Read the article

  • Microsoft’s new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year. Microsoft literally buys computers by the truckload. From what I understand, it’s a typical practice amongst large software vendors. You plug a few wires in, you test it, and you instantly have mega tera tera flops (don’t hold me to that number). Microsoft has been trying to plug away at their cloud services (named Azure), which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly.

    With this in mind, it doesn’t surprise me that I was recently sent an executive email concerning Microsoft’s new technical computing initiative. I find it to be a great marketing idea with actual substance behind their real work. From the programmer academic perspective, in college we dreamed about this type of processing power. This has decades of computer science theory behind it. A copy of the email received (note that I almost deleted this email, thinking it was spam due to its length):

    We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can, however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly.

    Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland.

    To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges.

    We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time-consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves.

    The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries.

    Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities.

    New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems.

    Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group.

    Enabling more people to make better predictions

    We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers:

    Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day.

    Aerospace engineering firm a.i. solutions, Inc. needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars.

    Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections.

    Our Technical Computing direction

    Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas:

    Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed - reliably, consistently and quickly.

    Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and troubleshoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud.

    Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.

    Thinking bigger

    There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible:

        Better predictions to help improve the understanding of pandemics, contagion and global health trends.
        Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates.
        More accurate prediction of natural disasters and their impact to develop more effective emergency response plans.

    With an ambitious charter in hand, this new team is ready to build on our progress to-date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better. Bob

    Read the article

  • How to evaluate the quality of Rails code?

    - by Fortuity
    In a code review, what do you look for to assess a developer's expertise? Given an opportunity to look at a developer's work on a real-world project, what tell-tale signs are a tip-off to carelessness or lack of experience? Conversely, where do you look in the code to find evidence of a developer's skill or knowledge of best practices? For example, if I'm looking at a typical Rails app, I would be happy to see the developer is using RSpec (showing a commitment to using test-driven development and knowledge that RSpec is currently more popular than the default TestUnit). But in examining the specs for a Rails model, I see that the developer is testing associations, which might indicate a lack of real understanding of Rails testing requirements (since such tests are redundant given that they only test what's already implemented and tested in ActiveRecord). More generally, I might look to see if developers are writing their own implementations versus using widely available gems or if they are cleaning up code versus leaving lots of commented-out "leftovers." What helps you determine the skill of a Rails developer? What's your code quality checklist?
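
    To make the association-testing example concrete, here is a sketch of the kind of spec being criticized, in the shoulda-matchers style (the Post/User model and the second spec are hypothetical, added only for illustration). The first example can never fail independently of the belongs_to declaration it mirrors, which is the redundancy described above; the second exercises behavior built on top of the association:

        # spec/models/post_spec.rb -- hypothetical model spec
        require 'spec_helper'

        describe Post do
          # Redundant: merely restates `belongs_to :user` from the model.
          it { should belong_to(:user) }

          # More useful: tests behavior layered on the association.
          it "reports its author's display name" do
            user = User.new(:name => "Ada")
            post = Post.new(:user => user)
            post.author_name.should == "Ada"
          end
        end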

    Read the article

  • Tutorial: Getting Started with the NoSQL JavaScript / Node.js API for MySQL Cluster

    - by Mat Keep
    Tutorial authored by Craig Russell and JD Duncan. The MySQL Cluster team are working on a new NoSQL JavaScript connector for MySQL. The objectives are simplicity and high performance for JavaScript users:

        - allows end-to-end JavaScript development, from the browser to the server and now to the world's most popular open source database
        - native "NoSQL" access to the storage layer without going first through SQL transformations and parsing

    Node.js is a complete web platform built around JavaScript designed to deliver millions of client connections on commodity hardware. With the MySQL NoSQL Connector for JavaScript, Node.js users can easily add data access and persistence to their web, cloud, social and mobile applications. While the initial implementation is designed to plug and play with Node.js, the actual implementation doesn't depend heavily on Node, potentially enabling wider platform support in the future.

    Implementation

    The architecture and user interface of this connector are very different from other MySQL connectors in a major way: it is an asynchronous interface that follows the event model built into Node.js. To make it as easy as possible, we decided to use a domain object model to store the data. This allows users to query data from the database and have a fully-instantiated object to work with, instead of having to deal with rows and columns of the database. The domain object model can have any user behavior that is desired, with the NoSQL connector providing the data from the database. To make it as fast as possible, we use a direct connection from the user's address space to the database. This approach means that no SQL (pun intended) is needed to get to the data, and no SQL server is between the user and the data. The connector is being developed to be extensible to multiple underlying database technologies, including direct, native access to both the MySQL Cluster "ndb" and InnoDB storage engines. The connector integrates the MySQL Cluster native API library directly within the Node.js platform itself, enabling developers to seamlessly couple their high performance, distributed applications with a high performance, distributed, persistence layer delivering 99.999% availability. The following sections take you through how to connect to MySQL, how to query the data, and how to get started.

    Connecting to the database

    A Session is the main user access path to the database. You can get a Session object directly from the connector using the openSession function:

        var nosql = require("mysql-js");
        var dbProperties = {
            "implementation" : "ndb",
            "database" : "test"
        };
        nosql.openSession(dbProperties, null, onSession);

    The openSession function calls back into the application upon creating a Session. The Session is then used to create, delete, update, and read objects.

    Reading data

    The Session can read data from the database in a number of ways. If you simply want the data from the database, you provide a table name and the key of the row that you want. For example, consider this schema:

        create table employee (
          id int not null primary key,
          name varchar(32),
          salary float
        ) ENGINE=ndbcluster;

    Since the primary key is a number, you can provide the key as a number to the find function.

        var onSession = function(err, session) {
          if (err) {
            console.log(err);
            // ... error handling
          }
          session.find('employee', 0, onData);
        };

        var onData = function(err, data) {
          if (err) {
            console.log(err);
            // ... error handling
          }
          console.log('Found: ', JSON.stringify(data));
          // ... use data in application
        };

    If you want to have the data stored in your own domain model, you tell the connector which table your domain model uses by specifying an annotation, and pass your domain model to the find function.

        var annotations = new nosql.Annotations();
        var Employee = function(id, name, salary) {
          this.id = id;
          this.name = name;
          this.salary = salary;
          this.giveRaise = function(percent) {
            this.salary *= (1 + percent); // raise by percent, not down to it
          };
        };
        annotations.mapClass(Employee, {'table' : 'employee'});

        var onSession = function(err, session) {
          if (err) {
            console.log(err);
            // ... error handling
          }
          session.find(Employee, 0, onData);
        };

    Updating data

    You can update the emp instance in memory, but to make the raise persistent, you need to write it back to the database, using the update function.

        var onData = function(err, emp) {
          if (err) {
            console.log(err);
            // ... error handling
          }
          console.log('Found: ', JSON.stringify(emp));
          emp.giveRaise(0.12); // gee, thanks!
          session.update(emp); // oops, session is out of scope here
        };

    Using JavaScript can be tricky because it does not have the concept of block scope for variables. You can create a closure to handle these variables, or use a feature of the connector to remember your variables. The connector api takes a fixed number of parameters and returns a fixed number of result parameters to the callback function. But the connector will keep track of variables for you and return them to the callback. So in the above example, change the onSession function to remember the session variable, and you can refer to it in the onData function:

        var onSession = function(err, session) {
          if (err) {
            console.log(err);
            // ... error handling
          }
          session.find(Employee, 0, onData, session);
        };

        var onData = function(err, emp, session) {
          if (err) {
            console.log(err);
            // ... error handling
          }
          console.log('Found: ', JSON.stringify(emp));
          emp.giveRaise(0.12); // gee, thanks!
          session.update(emp, onUpdate); // session is now in scope
        };

        var onUpdate = function(err, emp) {
          if (err) {
            console.log(err);
            // ... error handling
          }
        };

    Inserting data

    Inserting data requires a mapped JavaScript user function (constructor) and a session. Create a variable and persist it:

        var onSession = function(err, session) {
          var data = new Employee(999, 'Mat Keep', 20000000);
          session.persist(data, onInsert);
        };

    Deleting data

    To remove data from the database, use the session remove function. You use an instance of the domain object to identify the row you want to remove. Only the key field is relevant.

        var onSession = function(err, session) {
          var key = new Employee(999);
          session.remove(key, onDelete);
        };

    More extensive queries

    We are working on the implementation of more extensive queries along the lines of the criteria query api. Stay tuned.

    How to evaluate

    The MySQL Connector for JavaScript is available for download from labs.mysql.com. Select the build: MySQL-Cluster-NoSQL-Connector-for-Node-js. You can also clone the project on GitHub. Since it is still early in development, feedback is especially valuable (so don't hesitate to leave comments on this blog, or head to the MySQL Cluster forum). Try it out and see how easy (and fast) it is to integrate MySQL Cluster into your Node.js platforms. You can learn more about other previewed functionality of MySQL Cluster 7.3 here.

    Read the article

  • Windows Server 2008, IIS7 and Windows Authentication

    - by Chalkey
    We currently have a development server set up on which we are trying to test some Windows authentication ASP.NET code. We have turned on Windows Authentication in IIS7 on Windows Server 2008 R2 fine, and it asks the user for a username and password as expected, but the problem is it doesn't appear to accept any credentials. This code, for example, always returns an empty string for the user name:

        Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
            Page.Title = "Home page for " + User.Identity.Name
        End Sub

    One theory we have is that we don't have Active Directory installed as of yet; we are just testing this by logging on via the machine name, not a domain. Is this type of authentication only applicable to domains (if so we can probably install Active Directory and some test accounts), or is it possible to get the user identity when logging in using the machine name? Ideally we would like to be able to test this on our local machines (Windows 7 Pro) using our own accounts (again, these aren't on a domain) and IIS, but this has the same issue as our dev server. Thanks.
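
    For anyone debugging the same symptom: Windows authentication itself does not require a domain (local accounts authenticate as MACHINENAME\username), and an empty User.Identity.Name often just means anonymous authentication is still enabled, so the page is never actually challenged. A hedged sketch of the web.config pieces usually involved, not a confirmed fix for this exact setup:

        <configuration>
          <system.web>
            <!-- Let Windows/IIS establish the identity -->
            <authentication mode="Windows" />
            <authorization>
              <!-- Deny anonymous users so every request must authenticate -->
              <deny users="?" />
            </authorization>
          </system.web>
        </configuration>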

    Read the article

  • What are Web runtime environments and programming languages

    - by Bradly Spicer
    I've been looking into the details behind these two different categories:

        Web runtime environments
        Web application programming languages

    I believe I have the correct information and have phrased it correctly, but I am unsure. I have been searching for a while but only find snippets of information, or what I see as useless information (I could be wrong). Here are my descriptions so far:

    Web runtime environments - A run-time environment implements part of the core behaviour of any computer language and allows it to be modified via an API or embedded domain-specific language. A web runtime environment is similar, except it uses web-based languages such as JavaScript, which utilise the core behaviour of a computer language. Another example of a run-time environment web language is jslibs, which is a standalone JavaScript development runtime environment for using JavaScript as a general all-round scripting language. JavaScript is often used to create responsive interfaces which improve the user experience and provide dynamic functionality without having to wait for the server to react and direct to another page.

    Web application programming languages - A web application programming language is something that mimics a traditional desktop application within a web page. For example, using PHP you can create forms and tables which use a database, similar to Microsoft Excel. Some of the other languages for web application programming are:

        Ajax
        Perl
        Ruby

    Here are some of the resources used:
    http://en.wikipedia.org/wiki/Web_application_development
    http://code.google.com/p/jslibs/

    I would like some confirmation that the descriptions I have created are correct, as I am still slightly unsure whether I have hit the nail on the head.

    Read the article

  • Talking JavaOne with Rock Star Raghavan Srinivas

    - by Janice J. Heiss
    Raghavan Srinivas, affectionately known as “Rags,” is a two-time JavaOne Rock Star (from 2005 and 2011) who, as a Developer Advocate at Couchbase, gets his hands dirty with emerging technology directions and trends. His general focus is on distributed systems, with a specialization in cloud computing. He worked on Hadoop and HBase during their early stages, has spoken at conferences worldwide on a variety of technical topics, and has conducted and organized hands-on labs and taught graduate classes. He has 20 years of hands-on software development and over 10 years of architecture and technology evangelism experience, and has worked for Digital Equipment Corporation, Sun Microsystems, Intuit and Accenture. He has evangelized and influenced the architecture of numerous technologies including the early releases of JavaFX, Java, Java EE, Java and XML, Java ME, AJAX and Web 2.0, and Java Security. Rags will be giving these sessions at JavaOne 2012: CON3570 -- Autosharding Enterprise to Social Gaming Applications with NoSQL and Couchbase; CON3257 -- Script Bowl 2012: The Battle of the JVM-Based Languages (with Guillaume Laforge, Aaron Bedra, Dick Wall, and Dr Nic Williams). Rags emphasized the importance of the Cloud: “The Cloud and the Big Data are popular technologies not merely because they are trendy, but, largely due to the fact that it's possible to do massive data mining and use that information for business advantage,” he explained. I asked him what we should know about Hadoop. “Hadoop,” he remarked, “is mainly about using commodity hardware and achieving unprecedented scalability. At the heart of all this is the Java Virtual Machine which is running on each of these nodes. The vision of taking the processing to where the data resides is made possible by Java and Hadoop.” And the most exciting thing happening in the world of Java today? “I read recently that Java projects on github.com are just off the charts when compared to other projects. It's exciting to realize the robust growth of Java and the degree of collaboration amongst Java programmers.” He encourages Java developers to take advantage of Java 7 for Mac OS X, which is now available for download. At the same time, he also encourages us to read the caveats. Originally published on blogs.oracle.com/javaone.

    Read the article

  • Chrome browser caching

    - by Kyle B.
    I do a lot of development on my local machine and would like to start using Chrome; however, I cannot seem to do a hard refresh (Ctrl+F5) or find any other key combination to get my browser to forcibly refresh all content at http://localhost. I change projects frequently in IIS, and this presents a problem because I see stylesheet and image data from my previous project, with no way to get the page to reload without forcibly dumping all cache data from the settings menu. Is there another key combination I am missing, or is there a place I can (on a site-by-site basis) turn off caching? I prefer not to have to clear out my temporary files in the browser settings as I switch projects frequently. Thanks, Kyle

    Read the article

  • SQL Monitor Alerts in Outlook Without Configuring Email Settings

    - by Fatherjack
    SQL Monitor is a Red Gate tool that I have a long history with; I have worked closely with the development team from a time before it was called SQL Monitor. It is with that history in mind that I am a little disappointed in myself for having only just found out about a pretty cool feature. Out of the box, SQL Monitor keeps itself to itself: it busily goes about watching over your servers, noting down when things look suspicious, change drastically, or are just out-and-out wrong. You have to go into the settings and provide email details (SMTP server, account details etc.) before it starts getting at all intrusive with warnings and alerts on the condition of your servers. However, it was after installing the most recent version, while going through the application screen by screen looking for new and interesting changes, that I noticed something that had avoided my attention. On the Alerts tab there is an option in the left-hand menu. I don't know how long ago it appeared or why I have never explored it previously, but it appears that you can see your alerts in the format of an RSS feed. When you click that link you are taken to a page that is the raw RSS XML – not too interesting in itself, but clearly you can use this in an RSS aggregator, such as Outlook. Note the URL of the newly opened page and take it with you into Outlook. For me it is in the form of http://SQLMonitorServerName/Alerts/Inbox/Feed. Again, this is something that I have only recently noticed: Outlook can aggregate RSS feeds. Below the Inbox and Drafts folders, one up from the bottom, is RSS Feeds. If you right-click that and choose to add a feed, you can supply the URL for the SQL Monitor alerts. And there you have it: your SQL Monitor alerts available in Outlook, where you can keep an eye on the number of unread items and pick them off at your convenience.

    Read the article
