Search Results

Search found 15376 results on 616 pages for 'once'.

  • Desktop Fun: Add New Theme Packs to Windows 7

    - by Asian Angel
    One of the wonderful things about Windows 7 is the availability of new themes, and with more becoming available each month there are plenty to choose from. Join us as we take a look at a sampler set of the great themes that you can download for your system. For the themes shown here we have included a full-screen image and a screenshot showing the wallpapers that are available with each theme. Once you have downloaded the themes, simply double-click on the theme-pack file to install them. Note: The system “text size and sound schemes” will vary slightly from theme to theme. Themes covered: Cats Anytime, Dogs in Summer, Tigers, Ceske jaro (Czech Spring), Brazil, Lugares Coloridos, Latvian Nature, Srpska priroda (Serbian Nature), Bicycle Ride around Taiwan, Bing’s Best, Avatar, and Zune Characters. Conclusion: If you are looking for an easy way to add some beautiful variety to your Windows 7 installation then head on over to the Microsoft website…you just might find that perfect theme waiting for your computer. Links: Windows 7 Themes at Microsoft | Ceske jaro (Czech Spring) at Softpedia

    Read the article

  • Kill your temp tables using keyboard shortcuts : SSMS

    - by jamiet
    Here’s a nifty little SSMS trick that my colleague Tom Hunter educated me on the other day and I thought it was worth sharing. If you’re a keyboard shortcut junkie then you’ll love it. How often, when working with code in SSMS that contains temp tables, do you see the following message: Msg 2714, Level 16, State 6, Line 78 There is already an object named '#table' in the database. Quite often I would imagine; it happens to me all the time! Usually I write a bit of code at the top of the query window that goes and drops the table if it exists, but there’s a much easier way of dealing with it. Remember that temp tables disappear as soon as your session ends, hence wouldn’t it be nice if there were a quick way of recycling (i.e. stopping and restarting) your session? Well, it turns out there is, and all it takes is a sequence of 4 keystrokes: Bring up the context menu using that mythically-named button that usually sits 3 to the right of the space bar; ‘C’ for “Connection”; ‘H’ for “Change Connection…”; ‘Enter’ to select the same connection you had open last time (screenshots below). Once you’ve done it a few times you’ll probably have the whole sequence down to less than a second. Such a simple little trick, I’m annoyed with myself for it not occurring to me before! The only caveat is that you’ll need a “USE <database>” directive at the top of your query window, but I don’t think that’s much of a bind! That is all, other than to say that if you like little SSMS titbits like this then Lee Everest’s blog is a good one to keep an eye on! @jamiet
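    For comparison, the guard code mentioned above usually looks something like this: a minimal sketch using the OBJECT_ID check that is idiomatic on SQL Server 2005/2008, with #table as a placeholder name.

        -- Drop the temp table if it already exists in this session
        IF OBJECT_ID('tempdb..#table') IS NOT NULL
            DROP TABLE #table;

        CREATE TABLE #table
        (
            Id INT NOT NULL,
            Payload NVARCHAR(50) NULL
        );

    Recycling the session instead makes this boilerplate unnecessary, since the dropped connection takes its temp tables with it.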

    Read the article

  • C# Neural Networks with Encog

    - by JoshReuben
    Neural Networks
    · I recently read the book Introduction to Neural Networks for C#, by Jeff Heaton. http://www.amazon.com/Introduction-Neural-Networks-C-2nd/dp/1604390093/ref=sr_1_2?ie=UTF8&s=books&qid=1296821004&sr=8-2-spell. Not the 1st ANN book I've perused, but a nice revision.
    · Artificial Neural Networks (ANNs) are a mechanism of machine learning – see http://en.wikipedia.org/wiki/Artificial_neural_network, http://en.wikipedia.org/wiki/Category:Machine_learning
    · Problems not suited to a neural network solution – programs that are easily written out as flowcharts consisting of well-defined steps, program logic that is unlikely to change, problems in which you must know exactly how the solution was derived.
    · Problems suited to a neural network – pattern recognition, classification, series prediction, and data mining. Pattern recognition – the network attempts to determine if the input data matches a pattern that it has been trained to recognize. Classification – take input samples and classify them into fuzzy groups.
    · As far as machine learning approaches go, I think SVMs are superior (see http://en.wikipedia.org/wiki/Support_vector_machine) – a neural network has certain disadvantages in comparison: an ANN can be overtrained, different training sets can produce non-deterministic weights, and it is not possible to discern the underlying decision function of an ANN from its weight matrix – they are a black box.
    · In this post, I'm not going to go into internals (believe me, I know them). An autoassociative network (e.g. a Hopfield network) will echo back a pattern if it is recognized.
    · Under the hood, there is very little maths. In a nutshell, some simple matrix operations occur during training: the input array is processed (normalized into bipolar values of 1, -1) and transposed from an input column vector into a row vector; these are subject to matrix multiplication and then subtraction of the identity matrix to get a contribution matrix. The dot product is taken against the weight matrix to yield a boolean match result. For backpropagation training, a derivative function is required. In learning, hill-climbing mechanisms such as Genetic Algorithms and Simulated Annealing are used to escape local minima. For unsupervised training, such as found in the Self Organizing Maps used for OCR, Hebb's rule is applied.
    · The purpose of this post is not to mire you in technical and conceptual details, but to show you how to leverage neural networks via an abstraction API – Encog.
    Encog
    · Encog is a neural network API.
    · Links to Encog: http://www.encog.org, http://www.heatonresearch.com/encog, http://www.heatonresearch.com/forum
    · Encog requires .Net 3.5 or higher – there is also a Silverlight version. Third-party libraries – log4net and nunit.
    · Encog supports feedforward, recurrent, self-organizing map, radial basis function and Hopfield neural networks.
    · Encog neural networks, and related data, can be stored in .EG XML files.
    · Encog Workbench allows you to edit, train and visualize neural networks. The Encog Workbench can generate code.
    Synapses and layers
    · The primary building blocks – almost every neural network will have, at a minimum, an input and output layer. In some cases, the same layer will function as both input and output layer.
    · To adapt a problem to a neural network, you must determine how to feed the problem into the input layer of a neural network, and receive the solution through the output layer of a neural network.
    · The Input Layer – for each input neuron, one double value is stored. An array is passed as input to a layer. Encog uses the interface INeuralData to hold these arrays. The class BasicNeuralData implements the INeuralData interface. Once the neural network processes the input, an INeuralData-based class will be returned from the neural network's output layer.
    · Convert a double array into an INeuralData object:

        INeuralData data = new BasicNeuralData(new double[10]);

    · The Output Layer – the neural network outputs an array of doubles, wrapped in a class based on the INeuralData interface.
    · The real power of a neural network comes from its pattern recognition capabilities. The neural network should be able to produce the desired output even if the input has been slightly distorted.
    · Hidden Layers – optional; between the input and output layers. Very much a “black box”. If the structure of the hidden layer is too simple it may not learn the problem. If the structure is too complex, it will learn the problem but will be very slow to train and execute. Some neural networks have no hidden layers: the input layer may be directly connected to the output layer. Further, some neural networks have only a single layer; a single-layer neural network has the single layer self-connected.
    · Connections, called synapses, contain individual weight matrices. These values are changed as the neural network learns.
    Constructing a Neural Network
    · The XOR operator is a frequent “first example” – the “Hello World” application for neural networks.
    · The XOR operator only returns true when both inputs differ:
        0 XOR 0 = 0
        1 XOR 0 = 1
        0 XOR 1 = 1
        1 XOR 1 = 0
    · Structuring a neural network for XOR – two inputs to the XOR operator and one output.
    · Input: 0.0,0.0 1.0,0.0 0.0,1.0 1.0,1.0
    · Expected output: 0.0 1.0 1.0 0.0
    · A Perceptron – a simple feedforward neural network to learn the XOR operator.
    · Because the XOR operator has two inputs and one output, the neural network will follow suit. Additionally, the neural network will have a single hidden layer, with two neurons to help process the data. The choice of 2 neurons in the hidden layer is arbitrary, and often comes down to trial and error.
    · Neuron Diagram for the XOR Network
    · The Encog workbench displays neural networks on a layer-by-layer basis.
    · Encog Layer Diagram for the XOR Network
    · Create a BasicNetwork – three layers are added to this network. The FinalizeStructure method must be called to inform the network that no more layers are to be added. The call to Reset randomizes the weights in the connections between these layers.

        var network = new BasicNetwork();
        network.AddLayer(new BasicLayer(2));
        network.AddLayer(new BasicLayer(2));
        network.AddLayer(new BasicLayer(1));
        network.Structure.FinalizeStructure();
        network.Reset();

    · Neural networks frequently start with a random weight matrix. This provides a starting point for the training methods. These random values will be tested and refined into an acceptable solution. However, sometimes the initial random values are too far off. Sometimes it may be necessary to reset the weights again, if training is ineffective. These weights make up the long-term memory of the neural network.
    Additionally, some layers have threshold values that also contribute to the long-term memory of the neural network. Some neural networks also contain context layers, which give the neural network a short-term memory as well. The neural network learns by modifying these weight and threshold values.
    · Now that the neural network has been created, it must be trained.
    Training a Neural Network
    · Construct an INeuralDataSet object – it contains the input array and the expected output array (of corresponding range). Even though there is only one output value, we must still use a two-dimensional array to represent the output.

        public static double[][] XOR_INPUT = {
            new double[2] { 0.0, 0.0 },
            new double[2] { 1.0, 0.0 },
            new double[2] { 0.0, 1.0 },
            new double[2] { 1.0, 1.0 } };

        public static double[][] XOR_IDEAL = {
            new double[1] { 0.0 },
            new double[1] { 1.0 },
            new double[1] { 1.0 },
            new double[1] { 0.0 } };

        INeuralDataSet trainingSet = new BasicNeuralDataSet(XOR_INPUT, XOR_IDEAL);

    · Training is the process where the neural network's weights are adjusted to better produce the expected output. Training will continue for many iterations, until the error rate of the network is below an acceptable level. Encog supports many different types of training. Resilient Propagation (RPROP) is a general-purpose training algorithm. All training classes implement the ITrain interface. The RPROP algorithm is implemented by the ResilientPropagation class. Training the neural network involves calling the Iteration method on the ITrain class until the error is below a specific value. The code loops through as many iterations, or epochs, as it takes to get the error rate for the neural network below 1%. Once the neural network has been trained, it is ready for use.

        ITrain train = new ResilientPropagation(network, trainingSet);

        for (int epoch = 0; epoch < 10000; epoch++)
        {
            train.Iteration();
            Debug.Print("Epoch #" + epoch + " Error:" + train.Error);
            if (train.Error < 0.01) break;
        }

    Executing a Neural Network
    · Call the Compute method on the BasicNetwork class.

        Console.WriteLine("Neural Network Results:");
        foreach (INeuralDataPair pair in trainingSet)
        {
            INeuralData output = network.Compute(pair.Input);
            Console.WriteLine(pair.Input[0] + "," + pair.Input[1]
                + ", actual=" + output[0] + ",ideal=" + pair.Ideal[0]);
        }

    · The Compute method accepts an INeuralData class and also returns an INeuralData object.

        Neural Network Results:
        0.0,0.0, actual=0.002782538818034049,ideal=0.0
        1.0,0.0, actual=0.9903741937121177,ideal=1.0
        0.0,1.0, actual=0.9836807956566187,ideal=1.0
        1.0,1.0, actual=0.0011646072586172778,ideal=0.0

    · The network has not been trained to give the exact results. This is normal. Because the network was trained to 1% error, each of the results will generally also be within 1% of the expected value.
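    Pulling the fragments above together, the whole XOR example fits in one small program. This is a minimal sketch assuming the Encog 2.x C# API used throughout this post; the namespace imports are my best guess and may need adjusting against the Encog documentation:

        using System;
        using Encog.Neural.Data;                 // INeuralData, INeuralDataPair (assumed)
        using Encog.Neural.Data.Basic;           // BasicNeuralDataSet (assumed)
        using Encog.Neural.Networks;             // BasicNetwork
        using Encog.Neural.Networks.Layers;      // BasicLayer
        using Encog.Neural.Networks.Training;    // ITrain
        using Encog.Neural.Networks.Training.Propagation.Resilient; // ResilientPropagation

        public class XorExample
        {
            public static void Main()
            {
                double[][] xorInput = { new[] { 0.0, 0.0 }, new[] { 1.0, 0.0 },
                                        new[] { 0.0, 1.0 }, new[] { 1.0, 1.0 } };
                double[][] xorIdeal = { new[] { 0.0 }, new[] { 1.0 },
                                        new[] { 1.0 }, new[] { 0.0 } };

                // 2 input neurons, 2 hidden neurons, 1 output neuron
                var network = new BasicNetwork();
                network.AddLayer(new BasicLayer(2));
                network.AddLayer(new BasicLayer(2));
                network.AddLayer(new BasicLayer(1));
                network.Structure.FinalizeStructure();
                network.Reset(); // randomize the initial weights

                var trainingSet = new BasicNeuralDataSet(xorInput, xorIdeal);
                ITrain train = new ResilientPropagation(network, trainingSet);

                // Train until the error rate drops below 1% (or we give up)
                for (int epoch = 0; epoch < 10000; epoch++)
                {
                    train.Iteration();
                    if (train.Error < 0.01) break;
                }

                foreach (INeuralDataPair pair in trainingSet)
                {
                    INeuralData output = network.Compute(pair.Input);
                    Console.WriteLine(pair.Input[0] + "," + pair.Input[1]
                        + ", actual=" + output[0] + ", ideal=" + pair.Ideal[0]);
                }
            }
        }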

    Read the article

  • Tips To Manage An Effective Come Back To Work After A Long Vacation

    - by Gopinath
    Vacations are very relaxing – no need to reply to endless mails, no marathon meetings or conference calls. It’s all about fun during the vacation. The troubles begin as you near the end of the vacation and start to think about getting back to work. Once we are back at work after a long vacation there will be many things to worry about – a pile of snail mail, hundreds of unread emails, a flood of phone calls to answer and a stream of scheduled meetings. How do you handle all the backlog and catch up quickly with the inflow of work? Here is a management tip from the Harvard Business Review blog on how to get back to work the right way after a long vacation: Block off your morning. Make sure you don’t have any meetings scheduled or big projects due. Then before you open your inbox, pause and think about your work priorities. As you make your way through emails and voicemails, focus on returning the messages that are connected to what matters most. Defer or delegate things that aren’t top priority. And remember it will probably take more than one day to get caught up, so be easy on yourself. Hope these tips let you plan the right comeback to work after your vacation. Image credit (cc): flickr/dfwcre8tive. This article, titled “Tips To Manage An Effective Come Back To Work After A Long Vacation”, was originally published at Tech Dreams.

    Read the article

  • Windows Phone 7 ActiveSync error 86000C09 (My First Post!)

    - by Chris Heacock
    Hello fellow geeks! I'm kicking off this new blog with an issue that was a real nuisance, but was relatively easy to fix. During a recent Exchange 2003 to 2010 migration, one of the users was getting an error on his Windows Phone 7 device. The error code that popped up on the phone on every sync attempt was 86000C09. We tested the following: Different user on the same device: WORKED. Problem user on a different device: FAILED. That seemed to point (conclusively) at the user's account as the crux of the issue. This error can come up if a user has too many devices syncing, but he had no other phones. We verified that using the following command: Get-ActiveSyncDeviceStatistics -Identity USERID. Turns out, it was the old familiar inheritable permissions issue in Active Directory. :-/ This user was not an admin, nor had he ever been one. HOWEVER, his account was cloned from an ex-admin user, so the unchecked box stayed unchecked. We checked the box and voila, data started flowing to his device(s). Here's a refresher on enabling inheritable permissions: Open ADUC, and enable Advanced Features. Then open properties and go to the Security tab for the user in question. Click on Advanced, and the following screen should pop up: Verify that "Include inheritable permissions from this object's parent" is *checked*. You will notice that for certain users, this box keeps getting unchecked. This is normal behavior due to the inbuilt security of Active Directory. People that are in the following groups will have this flag altered by AD:
    - Account Operators
    - Administrators
    - Backup Operators
    - Domain Admins
    - Domain Controllers
    - Enterprise Admins
    - Print Operators
    - Read-Only Domain Controllers
    - Replicator
    - Schema Admins
    - Server Operators
    Once the box is checked, permissions will flow and the user will be set correctly. Even if the box later gets unchecked, they will function normally, as they now have the proper permissions configured. You need to perform this same exercise when enabling users for Lync, but that's another blog. :-)   -Chris
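    If you would rather check that flag from PowerShell than click through ADUC, something like the following should work. This is a sketch that assumes the Active Directory module (Windows 2008 R2 RSAT) is available, with USERID as a placeholder:

        Import-Module ActiveDirectory

        # Read the security descriptor of the user object via the AD: drive
        $dn  = (Get-ADUser USERID).DistinguishedName
        $acl = Get-Acl -Path "AD:$dn"

        # True means "Include inheritable permissions..." is UNchecked;
        # False means inheritance is enabled
        $acl.AreAccessRulesProtected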

    Read the article

  • SQLAuthority News – Author Visit to Nepal TechMela – 2 Technical Sessions

    - by pinaldave
    Microsoft MDP Nepal is going to organize a Tech Mela for the IT community of Nepal on March 29 & 30, 2010 (2066 Chaitra 16 & 17), Monday and Tuesday, at the Russian Center for Science & Culture, Kamalpokhari, Kathmandu. The objective of the event is to enhance and exchange knowledge about Information Technology, as well as Microsoft products and technologies, with the IT community. I am very excited to attend this one-of-a-kind event in Nepal. I will be giving two presentations at the said event: 1) Become An Efficient Developer – Learn The Tricks of SQL Server Management Studio (SSMS). There are so many features in SQL Server Management Studio that we may not know about or use all of them. This presentation will impart tricks and tips of SQL Server Management Studio to the event-goers, aiming to make you an efficient developer with an edge over other developers. 2) Good, Bad and Ugly: The Story of Index. An index is often considered the sure-shot tool for improving the performance of any query. Learn the basics with examples and discover the good, bad and ugly sides of indexes. This session will help you write queries efficiently in the future. I am very excited to attend this special event, as this is the very first time I will be presenting technical sessions in Nepal. If you are in Nepal, I strongly suggest that you go to this once-in-a-lifetime IT fair. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • Partner Webcast – More out of Database Appliance with DB Options - 13 September 2012

    - by Thanos
    The Oracle Database Appliance is a new way to take advantage of the world's most popular database—Oracle Database 11g—in a single, easy-to-deploy and manage system. It's a complete package of software, server, storage, and networking that's engineered for simplicity, saving time and money by simplifying deployment, maintenance, and support of database workloads. But that is not all: with support for all Oracle Database Options, Oracle Database Appliance can be the ideal solution for many use cases.
    Feature: Simplifies deployment, maintenance, and support of high-availability database workloads. Benefit: Saves significant time and effort throughout the database administration lifecycle.
    Feature: An engineered system of software, server, storage, and networking. Benefit: High availability for a wide range of custom and packaged OLTP and data warehousing application databases.
    Feature: Simple one-button installation, full-stack integrated patching and diagnostics. Benefit: Reduces planned and unplanned downtime by automatically monitoring and logging service requests with Oracle Support.
    Feature: Built using the world’s #1 database. Benefit: Protects databases from server and storage failures with Oracle Real Application Clusters and Automatic Storage Management.
    Feature: Unique Pay-As-You-Grow software licensing. Benefit: Reduces cost with the flexibility to adjust your software spend as your business grows, without the need for any hardware upgrades.
    Discover the Oracle Database Appliance Value Proposition and learn how to position and combine it with database options to capture new business and easily roll out solutions safely and with maximum cost efficiency. This webcast is repeated once again for your benefit. Agenda: Oracle Database & Engineered Systems Innovation; What’s the Oracle Database Appliance?; Oracle Database Appliance Value Proposition; Oracle Database Appliance with Database Options; Oracle Database Appliance Partners Business. Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Duration: 1 hour. Register Now! Oracle Database Appliance is available for purchase at the Oracle Store under Engineered Systems. For any questions please contact us at partner.imc-AT-beehiveonline.oracle-DOT-com. Visit our ISV Migration Center blog regularly or follow us @oracleimc to learn more about Oracle Technologies as well as upcoming partner webcasts and events.

    Read the article

  • Google tweets – Now search twitter archives using Google

    - by samsudeen
    Google has launched a Twitter archive service which allows you to search tweets in real time as well as across Twitter's huge public archive (remember, Twitter crossed its 10 billionth tweet last month). The search results are displayed as tweets with the Twitter logo. To explore the Twitter search, go to the Google.com homepage, select “Show options” on the search results page, then select “Updates”. The search is similar to regular Google search, with options to dig through the tweets by timeframe. You can explore results by zooming through a particular time range or date. In addition to the time chart, it also displays the relative volume of activity on Twitter about the topic; as you can see, there is a spike about the GSLV launch after 3 PM today. There is also a shortcut link, “Now”, in the left corner which displays the latest results on the topics searched. The tweets also get refreshed automatically. Considering the huge volume of activity (50 million messages per day) on Twitter, the archive is only going to get bigger. By providing such a feature, Google has once again proved it is way ahead of others in search.

    Read the article

  • Save GMail attachments directly to Google Drive

    - by Gopinath
    What makes Google Drive attractive is its ability to play nice with other Google offerings like Google Docs, GMail, and Android mobiles & tablets. Google Drive is well integrated with Google Docs, and every document you create is automatically saved on Google Drive. Google Drive’s integration with other services like GMail may be in progress at Google, but enthusiastic community developers have released a plugin to integrate it with GMail. The Google Chrome plugin “Gmail Attachments To Drive” lets you save GMail attachments to Google Drive with a single click. Once the plugin is installed, it adds a link “Save To Drive” next to each attachment displayed in GMail and, on clicking, it automatically saves the file to Google Drive. I tested the plugin by saving attachments like PDFs, MS Word documents and images to Google Drive, and it worked very well. I don’t have any complaints about the plugin except a couple of feature requests. If the plugin provided a “Save All Files To Drive” option, it would be very helpful for saving all the attachments of an email in one shot. Also, it would be great if the developer could extend it to Firefox too. Anyway, it’s a great plugin for Google Drive users and worth checking out. Install Gmail Attachments To Drive for Google Chrome

    Read the article

  • SQLAuthority Book Review – DBA Survivor: Become a Rock Star DBA

    - by pinaldave
    DBA Survivor: Become a Rock Star DBA – Thomas LaRock Link to Amazon Link to Flipkart First of all, I thank all my readers who, when I wrote that I could not get this book in any local book store, offered to send me a copy of this good book. A very special mention goes to Sripada and Jayesh, who put so much effort into finding my home address and sending me the hard copy. Before, I did not have a copy of the book; now I already have two! It surprises me how my readers were able to find my home address, which I have not publicly shared. Quick Review: This is indeed an easy-to-read and fun book. We all work day and night with technology, yet we should not forget to show our love and care for our family at home. For our souls that starve for peace and guidance, this is the “it” book for technology enthusiasts. Though this book was specifically written for DBAs, its reach is not limited to DBAs only, because the lessons incorporated in it actually apply to all. This is one of the most motivating technical books I have read. Detailed Review: Let us go over a few questions first: Who wants to be as famous as a rockstar in the field of Database Administration? How can one learn what it takes to become a top-notch software developer? If you are a beginner in your field, how will you go to the next level? Your boss may be very kind or like Dilbert’s Boss – what will you do? How do you keep growing when the ecosystem around you does not support you? You are almost at the top but there is someone else at the TOP – what do you do, and how do you avoid office politics? As a database developer, what should be your basic responsibility? And many more… I was able to read the book completely in one sitting and I loved it. Before I continue with my opinion, I want to echo the opinion of Kevin Kline, who has written the foreword of the book. He has truly suggested that “You hold in your hands a collection of insights and wisdom on the topic of database administration gained through many years of hard-won experience, long nights of study, and direct mentorship under some of the industry’s most talented database professionals and information technology (IT) experts.” Today, the IT field is getting bigger and better, and talking about terabytes of database becomes “more” normal every single day. The gods and demigods of database professionals are taking care of these large-scale databases and are carefully maintaining them. In this world, there are many who are just taking their first step. There are many experts in different technology fields who are asked to address issues with databases, and there is YOU and ME, who are just new to this work. So we ask ourselves WHERE to begin and HOW to begin. We adore and follow the religion of our rockstars, but oftentimes we really have no idea about their background and their struggles. Every rockstar has his success story, which needs to be digested before learning his tricks and tips. This book starts on the same note and teaches the two most important lessons for anybody who wants to be a DBA rockstar – to focus on the single goal of learning, and to excel at the technology. The story starts with three simple guidelines – Get Prepared, Get Trained, Get Certified. Once a person learns the skills, it is about time to enrich and improve them. I am sure that the right opportunity will then come finding them, and they will not have to run behind it.
    However, the real challenge for any person is the first day or first week. A new employee, no matter how experienced he is, sometimes has no clue about what one should do at a new job. Chapters 2 and 3 precisely talk about what one should do as soon as the new job begins. They are also written keeping in focus the fact that each job can be very different, but a few infrastructure setups and programming concepts remain the same. Learning the basics of the database was really interesting. I like to focus on the roots of any technology. It is important to understand the structure of the database before suggesting what indexes need to be created; in the same way, this book covers the most essential knowledge that must be learned by most database developers. I think the title of the fourth chapter is my favorite sentence in this book. I can see that I will be saying this again and again in the future – “A Development Server Is a Production Server to a Developer“. I have worked in the software industry for almost 8 years now and I have seen so many developers sitting in their chairs and waiting for instructions from their lead about how to improve the code or what to do next. When I talk to them, I suggest that they experiment with their server and try various techniques. I think they all should understand that for them, a development server is their production server, and they need to pay proper attention to the code from the beginning. There should be NO inappropriate code from the beginning. One has to fully focus and give their best; if they are not sure they should ask, but they should do something and stay active. Chapters 5 and 6 talk about two essential skills for any developer and database administrator – the ethics of developers when they are working with a production server, and how to support software which is running on the production server. I have met many people who know the theory by heart, but when put in front of a keyboard they do not know where to start. The first thing they do is open the browser and search online, instead of opening SQL Server Management Studio. This can very well happen to anybody who is experienced as well. Chapters 5 and 6 address that situation and also include handy scripts which can solve almost all the basic troubleshooting issues. “Where’s the Buffet?” By far, this is the best chapter in this book. If you have ever met me, you would know that I love food. After reading this chapter, I felt Thomas had written it just keeping me in mind. I think there will be many other people who feel the same way, too. Even my wife, who read this chapter, thought it was specifically written for me. I will not talk any more about this chapter, as this is one must-read chapter. And of course, this is about real ‘FOOD‘. I am an SQL Server Trainer and Consultant and I totally agree with the point made in chapter 8 of this book about what is necessary to train employees and people. Millions of dollars’ worth of labor is continuously done in the world that has faults and is incorrect. Once something goes wrong, a very expensive consultant comes in and fixes the problem. This whole cycle can be stopped and improved if proper training is done. There is plenty of free training available as well, if one cannot afford paid training. “Connect. Learn. Share” – I think this is a great summary and bird’s-eye view of this book. Networking is the key.
    Everything which is discussed in this book can be taken to the next level if one properly uses these tips and continuously grows with them. Connecting with others, helping each other learn, and building a good knowledge-sharing environment should be the goal of everyone. Before I end the review I want to share a real experience. I have personally met one DBA who had worked in a single department of a company for so long that when he was put in a different department due to that department closing, he could not adjust and quit the job, despite having the same people and company around him. Adjusting to a new environment gets much tougher as a person gets more and more experienced. This book precisely addresses that issue, along with its solutions. I just cannot stop comparing the book with my personal journey. I found that so many things in the book are, coincidentally, written just as how we developers and DBAs think. I must express special thanks to Thomas for taking time out of his personal life to write this book for us. This book is indeed a book for everybody who wants to grow healthily in this tough and competitive environment. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • Cannot install Windows 7 on new PC (BSOD during installation)

    - by andronz
    Yesterday I received new hardware and assembled all of it. But while installing Windows 7 x64 I got a BSOD twice, and an error message like this: "Your PC has rebooted unexpectedly, installation will continue after reboot." I checked everything and tried to install one more time. It was almost successful – Windows installed – but I cannot format my disk, open cmd or Control Panel, boot from DVD, etc. It's the first time I have assembled a PC, and I don't know where the problem is: in my hardware or in Windows. My components:
    - MSI Z77A-G43
    - Intel i5 3570K
    - Seagate Barracuda (plugged into the 6 Gb/s SATA3 slot; the board also has a 3 Gb/s SATA slot)
    - Corsair XMS3 (4x4 GB)
    - HIS IceQ
    Also, in the BIOS I have changed the RAM frequency to DDR3 1600 and the voltage to 1.65 V.

    Read the article

  • IUSR account and SCCM 2007 R3 agent

    - by steve schofield
    I recently started working with SCCM, rolling the agent out to machines with IIS 7.x installed. I ran into issues where the SCCM agent wouldn't install. The errors were mostly 0x80004005 and 1603; another key one I found was Return Value 3 in the SCCM setup log. During the troubleshooting, I found a cool utility called WMI Diag. WMI Diag is a VBS script that reads the local WMI store and helps diagnose issues. Anyone working with SMS or SCCM should keep this handy tool around. The good thing was that in my particular case WMI was healthy. The issue turned out to be that I had changed the Anonymous Authentication module from using the IUSR account to inheriting the Application Pool identity. Once we temporarily switched back to IUSR, installed the agent, then switched the setting back to inherit the application pool identity, the SCCM agent installed with no issues. I'm not sure why switching back to the IUSR account solved my issue; if I find out, I'll update the post. More information on IIS 7 built-in accounts: http://learn.iis.net/page.aspx/140/understanding-built-in-user-and-group-accounts-in-iis-7 Specify an application pool identity: http://technet.microsoft.com/en-us/library/cc771170(WS.10).aspx SCCM resources (Config Mgr Setup / Deployment forums): http://social.technet.microsoft.com/Forums/en-US/configmgrsetup/threads http://www.myitforum.com (the best independent SCCM community resource) Hope this helps. Steve Schofield, Microsoft MVP - IIS
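    For reference, that Anonymous Authentication identity can also be flipped from the command line with appcmd. This is a sketch assuming the change is made at the server level; an empty userName tells IIS to use the application pool identity, while IUSR restores the default:

        REM Use the application pool identity for anonymous authentication
        %windir%\system32\inetsrv\appcmd set config -section:system.webServer/security/authentication/anonymousAuthentication /userName:"" /commit:apphost

        REM Switch back to the built-in IUSR account
        %windir%\system32\inetsrv\appcmd set config -section:system.webServer/security/authentication/anonymousAuthentication /userName:"IUSR" /commit:apphost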

    Read the article

  • Ask How-To Geek: iPad Battery Life, Batch Resizing Photos, and Syncing Massive Music Collections

    - by Jason Fitzpatrick
    Christmas was good to many of you and now you’ve got all sorts of tech questions related to your holiday spoils. Come on in and we’ll clear up how to squeeze more life out of your iPad, resize all those photos, and sync massive music collections to mobile devices. Once a week we dip into our reader mailbag and help readers solve their problems, sharing the useful solutions with you in the process. Read on to see our fixes for this week’s reader dilemmas.

    Read the article

  • LINQ – TakeWhile and SkipWhile methods

    - by nmarun
    I happened to read about these methods on Vikram's blog and tried testing them. Somehow, when I saw the output, things did not seem to add up right. I’m writing this blog to show the actual workings of these methods. Let’s take the same example as shown in Vikram’s blog and I’ll build around it.

        int[] numbers = { 5, 4, 1, 3, 9, 8, 6, 7, 2, 0 };

        foreach (var number in numbers.TakeWhile(n => n < 7))
        {
            Console.WriteLine(number);
        }

    Now, the way I (incorrectly) read the upper bound condition in the foreach loop was: ‘Give me all numbers that pass the condition of n < 7’. So I was expecting the answer to be: 5, 4, 1, 3, 2, 0. But when I run the application, I see only: 5, 4, 1, 3. Turns out I was wrong (happens at least once a day). The documentation on the method says ‘Returns elements from a sequence as long as a specified condition is true’. To show in code, my interpretation was the below:

        foreach (var number in numbers)
        {
            if (number < 7)
            {
                Console.WriteLine(number);
            }
        }

    But the actual implementation is equivalent to:

        foreach (var number in numbers)
        {
            if (number < 7)
            {
                Console.WriteLine(number);
            }
            else
            {
                break;
            }
        }

    So there it is, another situation where one simple word makes a difference of a whole world. The SkipWhile method has been implemented in a similar way – ‘Bypasses elements in a sequence as long as a specified condition is true and then returns the remaining elements’ and not ‘Bypasses elements in a sequence where a specified condition is true and then returns the remaining elements’. (Subtle… very, very subtle.) It feels strange saying this, but I hope very few need to read this article to understand these methods.
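    To round this out, here is a quick sketch of SkipWhile on the same array, illustrating the ‘as long as’ behaviour described above (note that 2 and 0 are not skipped, because skipping stops for good the moment 9 fails the test):

        int[] numbers = { 5, 4, 1, 3, 9, 8, 6, 7, 2, 0 };

        // Skips 5, 4, 1, 3; then 9 fails the test and skipping stops
        foreach (var number in numbers.SkipWhile(n => n < 7))
        {
            Console.WriteLine(number); // prints 9, 8, 6, 7, 2, 0
        }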

    Read the article

  • System can't sleep, can't shutdown (might be networking related)

    - by Kevin Q
    I speculate it's networking related, since I can only shut down my system if I do /etc/init.d/networking stop first. If I try to put the system to sleep, it disconnects the network then reconnects it automatically, and never goes to sleep. I uninstalled the network manager; after that, /etc/init.d/networking stop no longer helped shut down the system. When I boot into the OS, there are some warnings about unknown jobs: S20acpid, S20network-interface, S20network-manager. I got some advice to run update-rc.d -f to remove all those scripts, which I am not certain is the right thing to do. The messages are gone but the problem remains. When I tried to restart, I used to get this message. It doesn't seem to be the modem manager, since I later removed it. Once upon a time, the frozen shutdown/restart could be resumed when I pressed Ctrl+Alt+Delete, and it said something like "networking is stopped externally". If I log in in single-user mode and shut down or reboot, it shows this screen. I also tried GRUB_CMDLINE_LINUX_DEFAULT="acpi=force". This is Ubuntu 12.04 running on a ThinkPad T520 – not a new installation, and no such problems occurred before.

    Read the article

  • Heaps of Trouble?

    - by Paul White NZ
    If you’re not already a regular reader of Brad Schulz’s blog, you’re missing out on some great material.  In his latest entry, he is tasked with optimizing a query run against tables that have no indexes at all.  The problem is, predictably, that performance is not very good.  The catch is that we are not allowed to create any indexes (or even new statistics) as part of our optimization efforts. In this post, I’m going to look at the problem from a slightly different angle, and present an alternative solution to the one Brad found.  Inevitably, there’s going to be some overlap between our entries, and while you don’t necessarily need to read Brad’s post before this one, I do strongly recommend that you read it at some stage; he covers some important points that I won’t cover again here.
    The Example
    We’ll use data from the AdventureWorks database, copied to temporary unindexed tables.  A script to create these structures is shown below:

        CREATE TABLE #Custs
        (
            CustomerID INTEGER NOT NULL,
            TerritoryID INTEGER NULL,
            CustomerType NCHAR(1) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL
        );
        GO
        CREATE TABLE #Prods
        (
            ProductMainID INTEGER NOT NULL,
            ProductSubID INTEGER NOT NULL,
            ProductSubSubID INTEGER NOT NULL,
            Name NVARCHAR(50) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL
        );
        GO
        CREATE TABLE #OrdHeader
        (
            SalesOrderID INTEGER NOT NULL,
            OrderDate DATETIME NOT NULL,
            SalesOrderNumber NVARCHAR(25) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL,
            CustomerID INTEGER NOT NULL
        );
        GO
        CREATE TABLE #OrdDetail
        (
            SalesOrderID INTEGER NOT NULL,
            OrderQty SMALLINT NOT NULL,
            LineTotal NUMERIC(38,6) NOT NULL,
            ProductMainID INTEGER NOT NULL,
            ProductSubID INTEGER NOT NULL,
            ProductSubSubID INTEGER NOT NULL
        );
        GO
        INSERT #Custs (CustomerID, TerritoryID, CustomerType)
        SELECT C.CustomerID, C.TerritoryID, C.CustomerType
        FROM AdventureWorks.Sales.Customer C WITH (TABLOCK);
        GO
        INSERT #Prods (ProductMainID, ProductSubID, ProductSubSubID, Name)
        SELECT P.ProductID, P.ProductID, P.ProductID, P.Name
        FROM AdventureWorks.Production.Product P WITH (TABLOCK);
        GO
        INSERT #OrdHeader (SalesOrderID, OrderDate, SalesOrderNumber, CustomerID)
        SELECT H.SalesOrderID, H.OrderDate, H.SalesOrderNumber, H.CustomerID
        FROM AdventureWorks.Sales.SalesOrderHeader H WITH (TABLOCK);
        GO
        INSERT #OrdDetail (SalesOrderID, OrderQty, LineTotal, ProductMainID, ProductSubID, ProductSubSubID)
        SELECT D.SalesOrderID, D.OrderQty, D.LineTotal, D.ProductID, D.ProductID, D.ProductID
        FROM AdventureWorks.Sales.SalesOrderDetail D WITH (TABLOCK);

    The query itself is a simple join of the four tables:

        SELECT P.ProductMainID AS PID, P.Name, D.OrderQty, H.SalesOrderNumber, H.OrderDate, C.TerritoryID
        FROM #Prods P
        JOIN #OrdDetail D
            ON P.ProductMainID = D.ProductMainID
            AND P.ProductSubID = D.ProductSubID
            AND P.ProductSubSubID = D.ProductSubSubID
        JOIN #OrdHeader H
            ON D.SalesOrderID = H.SalesOrderID
        JOIN #Custs C
            ON H.CustomerID = C.CustomerID
        ORDER BY P.ProductMainID ASC
        OPTION (RECOMPILE, MAXDOP 1);

    Remember that these tables have no indexes at all, and only the single-column sampled statistics SQL Server automatically creates (assuming default settings).  The estimated query plan produced for the test query looks like this (click to enlarge):
    The Problem
    The problem here is one of cardinality estimation – the number of rows SQL Server expects to find at each step of the plan.  The lack of indexes and useful statistical information means that SQL Server does not have the information it needs to make a good estimate.  Every join in the plan shown above estimates that it will produce just a single row as output.  Brad covers the factors that lead to the low estimates in his post.
    In reality, the join between the #Prods and #OrdDetail tables will produce 121,317 rows.  It should not surprise you that this has rather dire consequences for the remainder of the query plan.  In particular, it makes a nonsense of the optimizer’s decision to use Nested Loops to join to the two remaining tables.  Instead of scanning the #OrdHeader and #Custs tables once (as it expected), it has to perform 121,317 full scans of each.  The query takes somewhere in the region of twenty minutes to run to completion on my development machine.
    A Solution
    At this point, you may be thinking the same thing I was: if we really are stuck with no indexes, the best we can do is to use hash joins everywhere. We can force the exclusive use of hash joins in several ways, the two most common being join and query hints.  A join hint means writing the query using the INNER HASH JOIN syntax; using a query hint involves adding OPTION (HASH JOIN) at the bottom of the query.  The difference is that using join hints also forces the order of the join, whereas the query hint gives the optimizer freedom to reorder the joins at its discretion.
    Adding the OPTION (HASH JOIN) hint results in this estimated plan: That produces the correct output in around seven seconds, which is quite an improvement!  As a purely practical matter, and given the rigid rules of the environment we find ourselves in, we might leave things there.  (We can improve the hashing solution a bit – I’ll come back to that later on.)
    Faster Nested Loops
    It might surprise you to hear that we can beat the performance of the hash join solution shown above using nested loops joins exclusively, and without breaking the rules we have been set. The key to this part is to realize that a condition like (A = B) can be expressed as (A <= B) AND (A >= B).  Armed with this tremendous new insight, we can rewrite the join predicates like so:

        SELECT P.ProductMainID AS PID, P.Name, D.OrderQty, H.SalesOrderNumber, H.OrderDate, C.TerritoryID
        FROM #OrdDetail D
        JOIN #OrdHeader H
            ON D.SalesOrderID >= H.SalesOrderID
            AND D.SalesOrderID <= H.SalesOrderID
        JOIN #Custs C
            ON H.CustomerID >= C.CustomerID
            AND H.CustomerID <= C.CustomerID
        JOIN #Prods P
            ON P.ProductMainID >= D.ProductMainID
            AND P.ProductMainID <= D.ProductMainID
            AND P.ProductSubID = D.ProductSubID
            AND P.ProductSubSubID = D.ProductSubSubID
        ORDER BY D.ProductMainID
        OPTION (RECOMPILE, LOOP JOIN, MAXDOP 1, FORCE ORDER);

    I’ve also added LOOP JOIN and FORCE ORDER query hints to ensure that only nested loops joins are used, and that the tables are joined in the order they appear.  The new estimated execution plan is: This new query runs in under 2 seconds.
    Why Is It Faster?
    The main reason for the improvement is the appearance of the eager Index Spools, which are also known as index-on-the-fly spools.  If you read my Inside The Optimiser series you might be interested to know that the rule responsible is called JoinToIndexOnTheFly. An eager index spool consumes all rows from the table it sits above, and builds an index suitable for the join to seek on.  Taking the index spool above the #Custs table as an example, it reads all the CustomerID and TerritoryID values with a single scan of the table, and builds an index keyed on CustomerID.  The term ‘eager’ means that the spool consumes all of its input rows when it starts up.
    The index is built in a work table in tempdb, has no associated statistics, and only exists until the query finishes executing. The result is that each unindexed table is only scanned once, and just for the columns necessary to build the temporary index.  From that point on, every execution of the inner side of the join is answered by a seek on the temporary index – not the base table.
    A second optimization is that the sort on ProductMainID (required by the ORDER BY clause) is performed early, on just the rows coming from the #OrdDetail table.  The optimizer has a good estimate for the number of rows it needs to sort at that stage – it is just the cardinality of the table itself.  The accuracy of the estimate there is important because it helps determine the memory grant given to the sort operation.  Nested loops join preserves the order of rows on its outer input, so sorting early is safe.  (Hash joins do not preserve order in this way, of course).
    The extra lazy spool on the #Prods branch is a further optimization that avoids executing the seek on the temporary index if the value being joined (the ‘outer reference’) hasn’t changed from the last row received on the outer input.  It takes advantage of the fact that rows are still sorted on ProductMainID, so if duplicates exist, they will arrive at the join operator one after the other.
    The optimizer is quite conservative about introducing index spools into a plan, because creating and dropping a temporary index is a relatively expensive operation.  Its presence in a plan is often an indication that a useful index is missing.
    I want to stress that I rewrote the query in this way primarily as an educational exercise – I can’t imagine having to do something so horrible to a production system.
    Improving the Hash Join
    I promised I would return to the solution that uses hash joins.  You might be puzzled that SQL Server can create three new indexes (and perform all those nested loops iterations) faster than it can perform three hash joins.  The answer, again, is down to the poor information available to the optimizer.  Let’s look at the hash join plan again:
    Two of the hash joins have single-row estimates on their build inputs.  SQL Server fixes the amount of memory available for the hash table based on this cardinality estimate, so at run time the hash join very quickly runs out of memory. This results in the join spilling hash buckets to disk, and any rows from the probe input that hash to the spilled buckets also get written to disk.  The join process then continues, and may again run out of memory.  This is a recursive process, which may eventually result in SQL Server resorting to a bailout join algorithm, which is guaranteed to complete eventually, but may be very slow.  The data sizes in the example tables are not large enough to force a hash bailout, but it does result in multiple levels of hash recursion.  You can see this for yourself by tracing the Hash Warning event using the Profiler tool.
    The final sort in the plan also suffers from a similar problem: it receives very little memory and has to perform multiple sort passes, saving intermediate runs to disk (the Sort Warnings Profiler event can be used to confirm this).  Notice also that because hash joins don’t preserve sort order, the sort cannot be pushed down the plan toward the #OrdDetail table, as in the nested loops plan. Ok, so now we understand the problems, what can we do to fix it?
    We can address the hash spilling by forcing a different order for the joins:

        SELECT P.ProductMainID AS PID, P.Name, D.OrderQty, H.SalesOrderNumber, H.OrderDate, C.TerritoryID
        FROM #Prods P
        JOIN #Custs C
            JOIN #OrdHeader H
                ON H.CustomerID = C.CustomerID
            JOIN #OrdDetail D
                ON D.SalesOrderID = H.SalesOrderID
            ON P.ProductMainID = D.ProductMainID
            AND P.ProductSubID = D.ProductSubID
            AND P.ProductSubSubID = D.ProductSubSubID
        ORDER BY D.ProductMainID
        OPTION (MAXDOP 1, HASH JOIN, FORCE ORDER);

    With this plan, each of the inputs to the hash joins has a good estimate, and no hash recursion occurs.  The final sort still suffers from the one-row estimate problem, and we get a single-pass sort warning as it writes rows to disk.  Even so, the query runs to completion in three or four seconds.  That’s around half the time of the previous hashing solution, but still not as fast as the nested loops trickery.
    Final Thoughts
    SQL Server’s optimizer makes cost-based decisions, so it is vital to provide it with accurate information.  We can’t really blame the performance problems highlighted here on anything other than the decision to use completely unindexed tables, and not to allow the creation of additional statistics.
    I should probably stress that the nested loops solution shown above is not one I would normally contemplate in the real world.  It’s there primarily for its educational and entertainment value.  I might perhaps use it to demonstrate to the sceptical that SQL Server itself is crying out for an index.
    Be sure to read Brad’s original post for more details.  My grateful thanks to him for granting permission to reuse some of his material.
    Paul White
    Email: [email protected]
    Twitter: @PaulWhiteNZ

    Read the article

  • Thread Synchronization and Synchronization Primitives

    When considering synchronization in an application, the decision truly depends on what the application and its worker threads are going to do. I would use synchronization if two or more threads could possibly manipulate the same instance of an object at the same time. An example of this in C# can be demonstrated through the use of storing data in a static object. A static object is initialized once per application and the data within the object can be accessed by all threads. I would use the synchronization primitives to prevent any data from being manipulated by multiple threads simultaneously. This would reduce any data corruption from occurring within the object. On the other hand, if all the threads used non-static objects and were independent of the other tasks, there would be no need to use synchronization.
    Synchronization primitives in C#:
    - Basic Blocking
    - Locking
    - Signaling
    - Non-Blocking Synchronization Constructs
    The Basic Blocking methods include Sleep, Join, and Task.Wait. These methods force threads to wait until other threads have completed. In addition, these methods can also force a thread to wait a set amount of time before continuing to work. The Locking primitive prevents a thread from entering a critical section of code while another thread is in the same critical section. If another thread attempts to enter a locked code block, it will wait until the code block is released. The Signaling primitive allows a thread to temporarily pause work until receiving a notification from another thread that it is ok to continue working. The Signaling primitive removes the need for polling. The Non-Blocking Synchronization Constructs protect access to a common field by calling upon processor primitives.
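    As an illustration of the blocking, locking, and non-blocking primitives described above, here is a minimal C# sketch; the class and field names are mine, not from any particular application:

        using System;
        using System.Threading;

        class SharedCounter
        {
            // Static fields: one instance per application, visible to all threads.
            private static int _locked;      // protected by a lock
            private static int _lockFree;    // protected by a non-blocking primitive
            private static readonly object _sync = new object();

            static void Main()
            {
                var workers = new Thread[4];
                for (int i = 0; i < workers.Length; i++)
                {
                    workers[i] = new Thread(() =>
                    {
                        for (int j = 0; j < 1000; j++)
                        {
                            lock (_sync) { _locked++; }           // locking
                            Interlocked.Increment(ref _lockFree); // non-blocking construct
                        }
                    });
                    workers[i].Start();
                }

                foreach (var w in workers) w.Join(); // basic blocking: wait for the workers

                // Both counters print 4000; without the lock or Interlocked,
                // lost updates would make the totals unpredictable.
                Console.WriteLine("{0} {1}", _locked, _lockFree);
            }
        }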

    Read the article

  • Do I need a full-time SEO employee? [closed]

    - by user481913
    I need SEO done for 1 site that's still new. It's a niche listings/ecommerce site. And I need to make a hiring decision for this – whether there's a need for a full-time employee dedicated to SEO, or is part-time or a contract sufficient? Here's some info and assumptions:
    1) The site's dynamic – so perhaps all ON PAGE SEO – keyword research, page titles, meta tags etc. – could be built in programmatically, and is perhaps a ONE TIME EFFORT.
    2) Most (perhaps not all) ON PAGE SEO is taken care of at the start or initially, so it doesn't need much time devoted to it later.
    3) Most ON PAGE SEO for a DYNAMIC site is a programmer's job (as probably an SEO employee doesn't understand programming), with some assistance from an SEO employee for KEYWORD RESEARCH etc. So once built into the software, it DOESN'T NEED much effort on the part of the SEO employee in the later stages.
    4) OFF PAGE SEO IS really where the SEO employee would spend most of his/her time – like building some links, writing articles/blogs, directory submissions etc.
    So considering that there's just 1 site and that most effort for the SEO employee is concentrated on OFF PAGE SEO, do I really need to hire someone Full Time? You're most welcome to add your own views and perspectives to this. It might help someone else as well in the future with their hiring decision.

    Read the article

  • SQLAuthority News – Best Compliments – DBA Survivor: Become a Rock Star DBA

    - by pinaldave
    Today’s blog post is about the biggest compliment I have ever received. I am very, very happy and would like to share my feelings with you. Thomas LaRock (Blog | Twitter) (known as SQLRockstar) keeps an excellent ranking of bloggers in the SQL Server arena. I am a big fan of this list and have been referring lots of people to it. I had been in the msdb database since the very first day of the ranking. Two days ago, I noticed that I was promoted to the model database. I often receive compliments about my blog, but this is the biggest compliment I have ever received. I have taken a snapshot of the same and listed it here. Thomas is a SQL Server MVP and on the Board of Directors for the Professional Association for SQL Server. He is the author of DBA Survivor: Become a Rock Star DBA. The book is designed to give a junior to mid-level DBA a better understanding of what skills are needed in order to survive (and thrive) in their career. This book is currently not available in India; I am waiting for someone traveling from the USA to bring a copy for me. I am very eager to read the book and promise to share the book review with all of you once I am done reading. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • Don’t be a dinosaur. Use Calendar Tree!

    - by jamiet
    If one spends long enough in my company, one will sooner or later have to listen to me bark on about subscribable calendars. I was banging on about them way back in 2009; I’ve cajoled SQLBits into providing one, provided one myself for the World Cup, and opined that they could be transformative for the delivery of BI. I believe subscribable calendars can change the world, but I have never been good at elucidating why I think so; for that reason I always direct people to read a blog post by Scott Adams (yes, the guy who draws Dilbert) entitled Calendar as Filter. In that blog post Scott writes: I think the family calendar is the organizing principle into which all external information should flow. I want the kids' school schedules for sports and plays and even lunch choices to automatically flow into the home calendar. Everything you do has a time dimension. If you are looking for a new home, the open houses are on certain dates, and certain houses that fit your needs are open at certain times. If you are shopping for some particular good, you often need to know the store hours. Your calendar needs to know your shopping list and preferences so it can suggest good times to do certain things. I think the biggest software revolution of the future is that the calendar will be the organizing filter for most of the information flowing into your life. You think you are bombarded with too much information every day, but in reality it is just the timing of the information that is wrong. Once the calendar becomes the organizing paradigm and filter, it won't seem as if there is so much. I wholly agree, and hence was delighted to discover (via the Hanselminutes podcast) that Scott has a startup called CalendarTree.com whose raison d'être is to solve this very problem. What better way to describe a Scott Adams startup than with a Scott Adams comic: I implore you to check out Calendar Tree and make the world a tiny bit better by using it to share any information that has a time dimension to it. Don’t be a dinosaur, use Calendar Tree! @Jamiet

    Read the article

  • Visual Studio RTM, Silverlight 4 RTM and WCF RIA Services download links

    - by Harish Ranganathan
    It’s been a long time since I blogged, primarily due to Tech Ed India, the ongoing Great Indian Developer Summit (GIDS 2010), and the related travels. However, here is a quick post with a few updates. Visual Studio 2010 RTMed in India during Tech Ed; we had the privilege of having Soma, our Senior VP, launch VS 2010 RTM in Bangalore, India, during Tech Ed India 2010. Silverlight 4 also RTMed during the same week. Earlier I had written posts about using the VS 2010 Beta and RC with the corresponding Silverlight and WCF RIA bits and getting them all to work together. Now that both VS 2010 and Silverlight have RTMed, here is a quick update on the necessary downloads.

    Visual Studio 2010 RTM can be downloaded from the MSDN Visual Studio site.
    If you are doing Silverlight 4 development with Visual Studio, download the Silverlight 4 Tools RC2 for Visual Studio.
    If you are developing with WCF RIA Services, download the WCF RIA Services RC2 for SL4 and VS 2010.
    Finally, if you want to use WCF RIA Services in ASP.NET, you will need the Domain DataSource control; and to use some of the additional service utility tools, you will need the WCF RIA Services Toolkit, available as the WCF RIA Services Toolkit April 2010 download.

    Once you have installed all of the above, you should see the following in Add/Remove Programs:
    WCF RIA Services v1.0 for Visual Studio 2010 (Version 4.0.50401.0)
    WCF RIA Services Toolkit (Version 4.0.50401.0)
    Microsoft Silverlight (Version 4.0.50401.0)
    Microsoft Silverlight 4 SDK (Version 4.0.50401.0)

    You will also need Expression Blend 4 for designing apps for Silverlight 4; you can download the release candidate from here. That’s it. You are all set for development with Visual Studio 2010, Silverlight 4, and WCF RIA Services. Cheers!!!

    Read the article

  • How do I draw an OpenGL point sprite using libgdx for Android?

    - by nbolton
    Here are a few snippets of what I have so far...

    void create() {
        renderer = new ImmediateModeRenderer();
        tiles = Gdx.graphics.newTexture(
            Gdx.files.getFileHandle("res/tiles2.png", FileType.Internal),
            TextureFilter.MipMap, TextureFilter.Linear,
            TextureWrap.ClampToEdge, TextureWrap.ClampToEdge);
    }

    void render() {
        Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        Gdx.gl.glClearColor(0.6f, 0.7f, 0.9f, 1);
    }

    void renderSprite() {
        int handle = tiles.getTextureObjectHandle();
        Gdx.gl.glBindTexture(GL.GL_TEXTURE_2D, handle);
        Gdx.gl.glEnable(GL.GL_POINT_SPRITE);
        Gdx.gl11.glTexEnvi(GL.GL_POINT_SPRITE, GL.GL_COORD_REPLACE, GL.GL_TRUE);
        renderer.begin(GL.GL_POINTS);
        renderer.vertex(pos.x, pos.y, pos.z);
        renderer.end();
    }

    create() is called once when the program starts, and renderSprite() is called for each sprite (so pos is unique to each sprite), where the sprites are arranged in a sort-of 3D cube. Unfortunately, this just renders a few white dots... I suppose the texture isn't being bound, which is why I'm getting white dots. Also, when I draw my sprites anywhere other than on the 0 z-axis, they do not appear. I read that I need to adjust my zfar and znear, but I have no idea how to do this using libgdx (perhaps it's because I'm using an ortho projection? What do I use instead?). I know that the texture is usable, since I was able to render it using a SpriteBatch, but I guess I'm not using it properly with OpenGL.

    Read the article

  • Getting Started with Employee Info Starter Kit (v4.0.0)

    - by joycsharp
    The new release of Employee Info Starter Kit contains lots of exciting features available in Visual Studio 2010 and .NET 4.0. To get started with the new version, you will need less than 5 minutes.

    Minimum System Requirements
    Before getting started, please make sure you have Visual Studio 2010 RC (or higher) and SQL Server 2005 Express Edition (or higher) installed on your machine.

    Running the Starter Kit for the First Time
    1. Download the starter kit 4.0.0 version from here and extract it.
    2. Go to <extraction folder>\Source\Eisk.Solution and click the solution file.
    3. From Solution Explorer, right-click the “Eisk.Web” web site project node, select “Set as Startup Project”, and hit Ctrl + F5.
    4. You will be prompted to install the database; just follow the instructions.
    That’s it! You are ready to use this starter kit.

    Running the Tests
    Employee Info Starter Kit contains an infrastructure for integration and unit testing, utilizing the test tools in Visual Studio 2010. Once you complete the steps above, take a minute to run the test cases:
    1. From Solution Explorer, go to “Solution Items\e-i-s-k-2010.vsmdi” and click it. You will see the available tests in the Visual Studio test lists. Select all except the “Load Tests” node (since the load tests take a while).
    2. Click the “Run Checked Tests” control in the upper-left corner. You will see the tests run and, finally, the status of the tests, which indicates the current health of your application across different scenarios.

    Technorati Tags: asp.net,architecture,starter kit,employee info starter kit,visual studio 2010,.net 4.0,entity framework

    Read the article

< Previous Page | 198 199 200 201 202 203 204 205 206 207 208 209  | Next Page >