Search Results

Search found 62759 results on 2511 pages for 'view first'.

Page 685/2511 | < Previous Page | 681 682 683 684 685 686 687 688 689 690 691 692  | Next Page >

  • Keep Track of Your Tasks with toDoo

    - by Asian Angel
    A tasks list can be convenient, but most of the time you cannot include details for those tasks, or you have to have an online account to do so. If you want to keep your tasks list with you on your computer or laptop and be able to add plenty of details, then you might want to look at toDoo. Note: Requires Adobe AIR (download link at bottom of article). toDoo in Action: Once you have installed toDoo, everything is rather straightforward for getting started. The first time that you start toDoo there will be temporary “fill-in” text in the “Subject & Details” areas. Simply select the temporary text and add your information. Notice that, if desired, you can easily set a custom date and time for your tasks right below the “Details” area. Note: toDoo does not minimize to the system tray. Once you have everything set, all that you need to do is click on “add task”. Here is our first new task being viewed in the “toDoo Description” tab. Time to add a second task… here you can see the drop-down calendar. You can scroll through and select a different month very easily… just click on the desired day and it will be set automatically. Adding our second task… If you need to edit any of the details for a particular task, you can do so in the “Edit toDoo” tab. This nice little app is convenient and easy to use. Conclusion: toDoo is a simple, straightforward app that lets you keep track of your tasks list and relevant details without an online account (especially helpful if you are without a wireless connection at a given moment). If you are looking for more of a list approach that runs on your desktop, then check out our article on Doomi. Links: Download ToDoo at Softpedia | Download ToDoo at Adobe Marketplace | Download Adobe AIR. Similar Articles (Productive Geek Tips): Turn Chrome’s New Tab Page into a Google Tasks Page | Make To-Do Bar in Outlook 2007 Show Only Today’s Tasks | Add a non-Google Tasks List to Chrome | Keep Track of Homework Assignments with Soshiku | Track the Amount of Time You Spend Online in Firefox

    Read the article

  • Replication Services as ETL extraction tool

    - by jorg
    In my last blog post I explained the principles of Replication Services and the possibilities it offers in a BI environment. One of the possibilities I described was the use of snapshot replication as an ETL extraction tool: “Snapshot Replication can also be useful in BI environments. If you don’t need a near real-time copy of the database, you can choose to use this form of replication. Next to being an alternative for Transactional Replication, it can be used to stage data so it can be transformed and moved into the data warehousing environment afterwards. In many solutions I have seen developers create multiple SSIS packages that simply copy data from one or more source systems to a staging database that serves as the source for the ETL process. The creation of these packages takes a lot of (boring) time, while Replication Services can do the same in minutes. It is possible to filter out columns and/or records, and it can even apply schema changes automatically, so I think it offers enough features here. I don’t know how the performance will be and whether it really works as well for this purpose as I expect, but I want to try this out soon!” Well, I have tried it out and I must say it worked well. I was able to let Replication Services do the work in a fraction of the time it would cost me to do the same in SSIS. What I did was the following: Configure snapshot replication for some Adventure Works tables; this was quite simple and straightforward. Create an SSIS package that executes the snapshot replication on demand and waits for its completion; this is something that you can’t do with out-of-the-box functionality. While configuring the snapshot replication, two SQL Agent jobs are created: one for the creation of the snapshot and one for the distribution of the snapshot. Unfortunately these jobs are asynchronous, which means that if you execute them they immediately report back whether the job started successfully or not; they do not wait for completion and report the result afterwards. So I had to create an SSIS package that executes the jobs and waits for their completion before the rest of the ETL process continues. Fortunately I was able to create the SSIS package with the desired functionality. I have made a step-by-step guide that will help you configure the snapshot replication, and I have uploaded the SSIS package you need to execute it. Configure snapshot replication: The first step is to create a publication on the database you want to replicate. Connect to SQL Server Management Studio and right-click Replication, then choose New… Publication… The New Publication Wizard appears; click Next. Choose your “source” database and click Next. Choose Snapshot publication and click Next. You can now select tables and other objects that you want to publish. Expand Tables and select the tables that are needed in your ETL process. In the next screen you can add filters on the selected tables, which can be very useful; think about selecting only the last x days of data, for example. It’s possible to filter out rows and/or columns. In this example I did not apply any filters. Schedule the Snapshot Agent to run at a desired time; by doing this a SQL Agent job is created which we need to execute from an SSIS package later on. Next you need to set the security settings for the Snapshot Agent. Click on the Security Settings button. In this example I ran the Agent under the SQL Server Agent service account. This is not recommended as a security best practice.
Fortunately there is an excellent article on TechNet which tells you exactly how to set up the security for Replication Services. Read it here and make sure you follow the guidelines! On the next screen choose to create the publication at the end of the wizard. Give the publication a name (SnapshotTest) and complete the wizard. The publication is created and the articles (tables in this case) are added. Now that the publication has been created successfully, it’s time to create a new subscription for this publication. Expand the Replication folder in SSMS and right-click Local Subscriptions, then choose New Subscriptions. The New Subscription Wizard appears. Select the publisher on which you just created your publication and select the database and publication (SnapshotTest). You can now choose where the Distribution Agent should run. If it runs at the distributor (push subscriptions) it causes extra processing overhead. If you use a separate server for your ETL process and databases, choose to run each agent at its subscriber (pull subscriptions) to reduce the processing overhead at the distributor. Of course we need a database for the subscription, and fortunately the wizard can create it for you; choose New database. Give the database the desired name, set the desired options and click OK. You can now add multiple SQL Server subscribers, which is not necessary in this case but can be very useful. You now need to set the security settings for the Distribution Agent. Click on the … button. Again, in this example I ran the Agent under the SQL Server Agent service account; read the security best practices here. Click Next. Make sure you create a synchronization job schedule again; this job is also necessary in the SSIS package later on. Initialize the subscription at first synchronization. Select the first box to create the subscription when finishing this wizard. Complete the wizard by clicking Finish and the subscription will be created. In SSMS you see a new database is created: the subscriber. There are no tables or other objects available in the database yet because the replication jobs have not run yet. Now expand the SQL Server Agent, go to Jobs and search for the job that creates the snapshot. Rename this job to “CreateSnapshot”. Now search for the job that distributes the snapshot and rename this job to “DistributeSnapshot”. Create an SSIS package that executes the snapshot replication: We now need an SSIS package that will take care of the execution of both jobs. The CreateSnapshot job needs to execute and finish before the DistributeSnapshot job runs. After the DistributeSnapshot job has started, the package needs to wait until it is finished before the package execution finishes. The Execute SQL Server Agent Job Task is designed to execute SQL Agent jobs from SSIS. Unfortunately this SSIS task only executes the job and reports back whether the job started successfully or not; it does not report whether the job actually completed with success or failure. This is because these jobs are asynchronous. The SSIS package I’ve created does the following: it runs the CreateSnapshot job; it checks every 5 seconds with a For Loop whether the job is completed; when the CreateSnapshot job is completed it starts the DistributeSnapshot job; and again it waits until the snapshot is delivered before the package finishes successfully. Quite simple, and the package is ready to use as a standalone extract mechanism.
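    As a rough illustration of what that wait logic boils down to — this is not the author's package, just a hedged T-SQL sketch that assumes the job names renamed above — the Execute SQL Server Agent Job Task plus For Loop is roughly equivalent to starting the job and polling msdb until the agent session reports a stop time:

      -- Start the snapshot job and wait for it to finish (poll every 5 seconds, like the package).
      DECLARE @job_id UNIQUEIDENTIFIER;

      SELECT @job_id = job_id
      FROM   msdb.dbo.sysjobs
      WHERE  name = N'CreateSnapshot';

      EXEC msdb.dbo.sp_start_job @job_name = N'CreateSnapshot';

      WHILE EXISTS (SELECT 1
                    FROM   msdb.dbo.sysjobactivity AS ja
                    WHERE  ja.job_id = @job_id
                           AND ja.start_execution_date IS NOT NULL
                           AND ja.stop_execution_date IS NULL)
      BEGIN
          WAITFOR DELAY '00:00:05';  -- a production version would also filter on the latest agent session
      END;

      -- Repeat the same pattern for the 'DistributeSnapshot' job before continuing the ETL process.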
After executing the package, the replicated tables are added to the subscriber database and are filled with data. Download the SSIS package here (SSIS 2008). Conclusion: In this example I only replicated 5 tables, and I could create an SSIS package that does the same in approximately the same amount of time. But if I replicated all the 70+ AdventureWorks tables I would save a lot of time and boring work! With Replication Services you also benefit from the fact that schema changes are applied automatically, which means your entire extract phase won’t break. Because a snapshot is created using the bcp utility (bulk copy) it’s also quite fast, so the performance will be quite good. A disadvantage of using snapshot replication as an extraction tool is the limitation on source systems: you can only choose SQL Server or Oracle databases to act as a publisher. So if you plan to build an extract phase for your ETL process that involves a lot of tables, think about Replication Services; it would save you a lot of time, and thanks to the Extract SSIS package I’ve created you can fit it perfectly into your usual SSIS ETL process.

    Read the article

  • Sudoku Solver

    - by merrillaldrich
    Today I am putting up something silly, just for fun. I set myself the task a while back to write a Sudoku solver in T-SQL, but with two dumb constraints that I would never follow given a real problem: I didn’t look at any documented techniques for solving Sudoku, and I specifically avoided T-SQL solutions, even though this has been done already many times. (The first thing I do with a real problem is to see who solved it already, and how, since most things have been done already. Not checking is...(read more)

    Read the article

  • Dual Boot Ubuntu 12.04 and Windows 7 on two separate SSDs with UEFI

    - by Björn
    With the following setup I get a blinking cursor after installation: Windows 7 64-bit is installed on the first SSD (not UEFI, using MBR). The installation of Ubuntu 12.04 64-bit on the GPT-partitioned disk seems to work without problems but does not boot; it stops with a blinking cursor. Partitioning scheme: sdb1 EFI boot partition (FAT32), sdb2 root (btrfs), sdb3 home (btrfs), sdb4 swap. Is it possible to mix UEFI and BIOS booting (MBR and GPT) when using two separate SSDs? I tried installing GRUB 2 into an MBR as well, but it would not install there...

    Read the article

  • Keeping up with SQL Server cumulative updates

    - by AaronBertrand
    Yesterday, a conversation on twitter reminded me that I haven't been keeping up with posting cumulative updates. I missed these updates for SQL Server 2008 on March 15: Cumulative Update #7 for SQL Server 2008 SP1 (10.00.2766) Cumulative Update #10 for SQL Server 2008 RTM (10.00.1835) And yesterday Glenn Berry ( blog | twitter ) was the first I know of to blog about Cumulative Update #9 for SQL Server 2005 SP3 (9.00.4294). He also shares some interesting information about changes to the support policy...(read more)
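    As a side note (not from the post): the quickest way to tell whether a given cumulative update is already applied is to compare the build numbers above — e.g. 10.00.2766 for CU #7 on SQL Server 2008 SP1 — against what the instance reports:

      -- Current build, service pack level and edition of the instance.
      SELECT SERVERPROPERTY('ProductVersion') AS build,
             SERVERPROPERTY('ProductLevel')   AS service_pack,
             SERVERPROPERTY('Edition')        AS edition;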

    Read the article

  • SQL Server 2012 : Changes to system objects in RC0

    - by AaronBertrand
    As with every new major milestone, one of the first things I do is check out what has changed under the covers. Since RC0 was released yesterday, I've been poking around at some of the DMV and other system changes. Here is what I have noticed: New objects in RC0 that weren't in CTP3 Quick summary: We see a bunch of new aggregates for use with geography and geometry. I've stayed away from that area of programming so I'm not going to dig into them. There is a new extended procedure called sp_showmemo_xml....(read more)
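    The post does not show the queries used, but one simple way to do this kind of spelunking (a rough sketch, not necessarily the author's method) is to list the Microsoft-shipped objects newest-first on each build and diff the output:

      -- List system-shipped objects, newest first, so objects added in a new build float to the top.
      SELECT name, type_desc, create_date
      FROM   sys.all_objects
      WHERE  is_ms_shipped = 1
      ORDER  BY create_date DESC, name;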

    Read the article

  • April 2010 Critical Patch Update Released

    - by eric.maurice
    Hi, this is Eric Maurice. Today Oracle released the April 2010 Critical Patch Update (CPUApr2010), the first one to include security fixes for Oracle Solaris. Today's Critical Patch Update (CPU) provides 47 new security fixes across the following product families: Oracle Database Server, Oracle Fusion Middleware, Oracle Collaboration Suite, Oracle E-Business Suite, Oracle PeopleSoft Enterprise, Oracle Life Sciences, Retail, and Communications Industry Suites, and Oracle Solaris. 28 of these 47 new vulnerabilities are remotely exploitable without authentication, but the criticality of the affected components and the severity of these vulnerabilities vary greatly. Customers should, as usual, refer to the Risk Matrices in the CPU Advisory to assess the relevance of these fixes for their environment (and the urgency with which to apply the fixes). 7 of the 47 new vulnerabilities affect various versions of Oracle Database Server. None of these 7 vulnerabilities are remotely exploitable without authentication. Furthermore, none of these fixes are applicable to client-only deployments. The most severe CVSS Base Score for the Database Server vulnerabilities is 7.1. As a reminder, information about Oracle's use of the CVSS 2.0 standard can be found in Note 394487.1 (My Oracle Support subscription required). Note that this Critical Patch Update includes fixes for vulnerabilities that were publicly disclosed by David Litchfield at the BlackHat DC Conference in early February (CVE-2010-0866 and CVE-2010-0867). 5 of the 47 new vulnerabilities affect various components of the Oracle Fusion Middleware product family. The highest CVSS Base Score for these vulnerabilities is 7.5. Note that the patches for Oracle WebLogic Server are cumulative, and this Critical Patch Update therefore also includes a fix for a vulnerability (CVE-2010-0073) that was the subject of a Security Alert issued by Oracle on February 4, 2010. Customers who have not applied the previously released patch should apply today's Critical Patch Update as soon as possible. As stated at the beginning of this blog, it is also worth highlighting that this Critical Patch Update provides 16 new fixes for the Sun product line. With the recent close of the Sun acquisition, both security organizations have worked diligently to align Sun's previous security practices with Oracle's. Java users know that Oracle released a Critical Patch Update for Java SE and Java For Business earlier this month (in accordance with the Java patching schedule previously published by Sun Microsystems). Please note that for the first time, the Java advisories included CVSS Scores to help assess the severity of the new vulnerabilities fixed with the advisory. The rapid inclusion of the Solaris product lines in the Critical Patch Update and the extension of Oracle Software Security Assurance to Sun technologies are evidence of the flexibility of Oracle's security assurance programs. These should also result in tangible security benefits for the users of the Oracle hardware and software stack (such as a predictable patching schedule for all Oracle products).

    Read the article

  • GlassFish Back from Devoxx 2011: Mature Java EE 6 and EE 7 well on its way

    - by alexismp
    I'm back from my 8th (!) Devoxx conference (I don't think I've missed one since 2004) and this conference keeps delivering on the promise of a Java developer paradise week. GlassFish was covered in many different ways, and I was not involved in a good number of them, which can only be a good sign! Several folks asked me when my Java EE 6 session with Antonio Goncalves was scheduled (we've been covering this for the past two years in University sessions, hands-on labs and regular sessions). It turns out we didn't team up this year (Antonio was crazy busy preparing for Devoxx France) and I had a regular GlassFish session. Instead, this year, Bert Ertman and Paul Bakker covered the 3-hour Java EE 6 University session ("Duke’s Duct Tape Adventures") on the very first day (using GlassFish), with great success it seems. The Java EE 6 lab was also a hit, with a full room of folks covering a lot of technical ground in 2.5 hours (with GlassFish of course). GlassFish was also mentioned during Cameron Purdy's keynote (pretty natural, even if that surprised a number of folks who had not been closely following GlassFish) and also in Stephan Janssen's keynote as the engine powering Parleys.com. In fact Stephan was a speaker in the GlassFish session, describing how they went from a single-instance Tomcat setup to a clustered GlassFish + MQ environment. Also in the session was Johan Vos (of Mollom fame, among other things). Both of these customer testimonials were made possible because GlassFish has been delivering full Java EE 6 implementations for almost two years now, which is plenty of time to see serious production deployments on it. The Java EE Gathering (BOF) was very well attended and very lively, with many spec leads participating and discussing progress and also pain points with folks in the room. Thanks to all those attending this session; a good number of RFEs and priority points came out of it. While this wasn't a GlassFish session by any means, it's great to have the current RESTful Admin and upcoming Java EE 7 planned features be a satisfactory answer to some of the requests from the attendees. Last but certainly not least, the GlassFish team is busy with Java EE 7 and version 4 of the product. This was discussed and shown during the Java EE keynote and in greater detail in Jerome Dochez' session. If the tweets on his demo (virtualization, provisioning, etc...) are any indication, it was very encouraging. Java EE 6 adoption is doing great, and GlassFish, being a production-quality reference implementation, is one of the first to benefit from this. And with GlassFish 4.0, we're looking at increasing the product and community adoption by offering a pragmatic technical solution to Java EE PaaS deployments. Stay tuned! (The impatient among you are encouraged to grab a 4.0 build and provide feedback.)

    Read the article

  • What are some handy tools in Windows that make a programmer's life easier? [closed]

    - by Omeid Herat
    I think there are some tools that almost every programmer needs, so I thought it might be useful if we share them. Please name the tool, give a small description of it, and include a link if possible. So here is the first one: WinMerge: WinMerge is an Open Source differencing and merging tool for Windows. WinMerge can compare both folders and files, presenting differences in a visual text format that is easy to understand and handle.

    Read the article

  • Add "My Dropbox" to Your Windows 7 Start Menu

    - by The Geek
    Over here at How-To Geek, we’re huge fans of Dropbox, the amazingly fast online file sync utility, but we’d be even happier if we could natively add it to the Windows 7 Start Menu, where it belongs. And today, that’s what we’ll do. Yep, that’s right. You can add it to the Start Menu… using a silly hack to the Libraries feature and renaming the Recorded TV library to a different name. It’s not a perfect solution, but you can access your Dropbox folder this way and it just seems to belong there. First things first, head into the Customize Start Menu panel by right-clicking on the start menu and using Properties, then make sure that Recorded TV is set to “Display as a link”. Next, right-click on Recorded TV, choose Rename, and then change it to something else like My Dropbox.   Now you’ll want to right-click on that button again, and choose Properties, where you’ll see the Library locations in the list… the general idea is that you want to remove Recorded TV, and then add your Dropbox folder. Oh, and you’ll probably want to make sure to set “Optimize this library for” to “General Items”. At this point, you can just click on My Dropbox, and you’ll see, well, Your Dropbox! (no surprise there). Yeah, I know, it’s totally a hack. But it’s a very useful one! Also, if you aren’t already using Dropbox, you should really check it out—2 GB for free, accessible via the web from anywhere, and you can sync to multiple desktops. Similar Articles Productive Geek Tips Use the Windows Key for the "Start" Menu in Ubuntu LinuxAccess Your Dropbox Files in Google ChromeSpeed up Windows Vista Start Menu Search By Limiting ResultsPin Any Folder to the Vista Start Menu the Easy WayEnable "Pin to Start Menu" for Folders in Windows Vista / XP

    Read the article

  • Membership in ASP.Net applications - part 1

    - by nikolaosk
    So far in all my posts, I have never mentioned anything about how to implement authentication/authorisation mechanisms in a web site. In all our professional web applications we do need some sort of mechanism to verify who our users are and what privileges they have on our site. This is the first post in a series of posts investigating how to implement membership (authentication + authorisation) in ASP.Net applications. We will look into the built-in web server security controls. We will look at the built...(read more)

    Read the article

  • Building My Website 101 - Essential Components of Website Building

    Are you one of those people who want to be part of a larger online community? Why not try to create your own website. In building my website, I learned several things that need to be considered. Sometimes we jump to conclusions when it comes to building a website, which often results in disaster. Hence, before trying to build a website, here are some of the factors you need to think about first.

    Read the article

  • KMenu & shell script

    - by allenskd
    I'm trying to make a very small shell script with a simple command and add it to the KMenu. The thing is that once it launches the shell script, the window closes right away, and I want to leave it open because the shell script attempts to run a web application using a framework. I tried this first: #!/bin/bash play run /home/david/Projects/ZS Then I tried this: #!/bin/bash konsole -e play run /home/david/Projects/ZSBlackboard In a terminal it runs perfectly, but from the launcher... not so much. Any solution or suggestion is appreciated, thanks

    Read the article

  • Partition Wise Joins II

    - by jean-pierre.dijcks
    One of the things that I did not talk about in the initial partition wise join post was the effect it has on resource allocation on the database server. When Oracle applies a different join method - e.g. not PWJ - what you will see in SQL Monitor (in Enterprise Manager) or in an Explain Plan is a set of producers and a set of consumers. The producers scan the tables in the join. If there are two tables, the producers first scan one table, then the other. The producers thus provide data to the consumers, and when the consumers have the data from both scans they do the join and give the data to the query coordinator. Now that behavior means that if you choose a degree of parallelism of 4 to run such a query, Oracle will allocate 8 parallel processes. Of these 8 processes 4 are producers and 4 are consumers. The consumers only actually do work once the producers are fully done with scanning both sides of the join. In the plan above you can see that the producers access table SALES [line 11] and then do a PX SEND [line 9]. That is the producer set of processes working. The consumers receive that data [line 8] and twiddle their thumbs while the producers go on and scan CUSTOMERS. The producers send that data to the consumers, indicated by PX SEND [line 5]. After receiving that data [line 4] the consumers do the actual join [line 3] and give the data to the QC [line 2]. BTW, the myth that you see twice the number of processes due to the setting PARALLEL_THREADS_PER_CPU=2 is obviously not true. The above is why you will see twice the number of processes of the DOP. In a PWJ plan the consumers are not present. Instead of producing rows and giving those to different processes, a PWJ only uses a single set of processes. Each process reads its piece of the join across the two tables and performs the join. The plan here is notably different from the initial plan. First of all, the hash join is done right on top of both table scans [line 8]. This query is a little more complex than the previous one, so there is a bit of noise above that bit of info, but for this post let's ignore that (sort stuff). The important piece here is that the PWJ plan will typically be faster and, in terms of PX process count and resources, typically cheaper. You may want to look out for those plans and try to get those to appear a lot... CREDITS: credits for the plans and some of the info on the plans go to Maria, as she actually produced these plans and is the expert on plans in general... You can see her talk about explaining the explain plan and other optimizer stuff over here: ODTUG in Washington DC, June 27 - July 1 On the Optimizer blog At OpenWorld in San Francisco, September 19 - 23 Happy joining and hope to see you all at ODTUG and OOW...
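    For context (this is not from the post itself): a full partition-wise join requires both tables to be partitioned the same way on the join key, so that each PX server can join one pair of partitions on its own. A hypothetical Oracle sketch, with invented table and column names:

      -- Both tables hash-partitioned on the join key; this is what lets the optimizer
      -- pair up partitions and join them with a single set of PX servers.
      CREATE TABLE customers_p (
          cust_id   NUMBER,
          cust_name VARCHAR2(100)
      )
      PARTITION BY HASH (cust_id) PARTITIONS 16;

      CREATE TABLE sales_p (
          cust_id NUMBER,
          amount  NUMBER
      )
      PARTITION BY HASH (cust_id) PARTITIONS 16;

      -- A parallel join on the partitioning key; no producer/consumer split is needed.
      SELECT /*+ PARALLEL(4) */ c.cust_name, SUM(s.amount)
      FROM   sales_p s
             JOIN customers_p c ON c.cust_id = s.cust_id
      GROUP  BY c.cust_name;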

    Read the article

  • Global Webcast: Increase Pharmaceutical Sales Effectiveness

    - by charles.knapp
    See a next-generation approach to Pharmaceutical sales challenges! • Increase the quality of sales interactions with enhanced call planning and eDetailing • Improve sample management with electronic signature storage and inventory tracking on the go • Increase marketing effectiveness with closed loop marketing and personalized content delivery Watch as senior vice president of CRM, Anthony Lye, and director of life sciences product strategy, Piers Evans, provide the first public look at Oracle's new Pharmaceutical Sales On The Go solution, powered by Oracle CRM On Demand Release 17 -- Life Sciences Edition. Register now for this informative GLOBAL webcast on March 31, 9 AM PDT/4 PM GMT.

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 21 (sys.dm_db_partition_stats)

    - by Tamarick Hill
    The sys.dm_db_partition_stats DMV returns page count and row count information for each table or index within your database. Let's have a quick look at this DMV so we can review some of the results. NOTE: I am going to create an ‘ObjectName’ column in our result set so that we can more easily identify tables. SELECT object_name(object_id) ObjectName, * FROM sys.dm_db_partition_stats As stated above, the first column in our result set is an object name based on the object_id column of this result set. The partition_id column refers to the partition_id of the index in question. Each index will have at least 1 unique partition_id and will have more depending on whether the object has been partitioned. The index_id column relates back to the sys.indexes table and uniquely identifies an index on a given object. A value of 0 (zero) in this column would indicate the object is a heap and a value of 1 (one) would signify the clustered index. Next is the partition_number, which signifies the number of the partition for a particular object_id. Since none of the tables in my result set have been partitioned, they all display 1 for the partition_number. Next we have the in_row_data_page_count, which tells us the number of data pages used to store in-row data for a given index. The in_row_used_page_count is the number of pages used to store and manage the in-row data. If we look at the first row in the result set, we will see we have 700 for this column and 680 for the previous one. This means that just managing the data (not storing it) requires 20 pages. The next column, in_row_reserved_page_count, is how many pages have been reserved, regardless of whether they are being used or not. The next 2 columns are used for storing LOB (Large Object) data, which could be text, image, varchar(max), or varbinary(max) columns. The next two columns, row_overflow, represent pages used for data that exceeds the 8,060-byte row size limit for the in-row data pages. The next columns, used_page_count and reserved_page_count, represent the sum of the in_row, lob, and row_overflow columns discussed earlier. Lastly, there is a row_count column which displays the number of rows that are in a particular index. This DMV is a very powerful resource for identifying page and row count information. By knowing the page counts for indexes within your database, you are able to easily calculate the size of indexes. For more information on this DMV, please see the below Books Online link: http://msdn.microsoft.com/en-us/library/ms187737.aspx Follow me on Twitter @PrimeTimeDBA
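    Not from the post, but a common way to put this DMV to work is to roll the page counts up into sizes (pages are 8 KB), for example:

      -- Used and reserved space per table and index, largest first.
      SELECT  OBJECT_NAME(ps.object_id)              AS ObjectName,
              ps.index_id,
              SUM(ps.row_count)                      AS total_rows,
              SUM(ps.used_page_count)     * 8 / 1024 AS used_mb,
              SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
      FROM    sys.dm_db_partition_stats AS ps
      GROUP BY ps.object_id, ps.index_id
      ORDER BY used_mb DESC;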

    Read the article

  • IIS7 + WCF + Silverlight problems

    - by Eanna
    Hey, I've been building a Silverlight application and a WCF service for a while now and recently tried to host them in IIS7. I installed IIS7 on Windows Server 2008 R2 and added these two applications to my default website. I am having a number of problems, so I'm hoping one of you can help out... 1) The Silverlight and WCF service applications do not work with pass-through authentication. I need to "connect as" the administrator server account when setting up the application. I read online that you should only need to use the "connect as" field when you are connecting to another computer. If I don't supply the admin credentials I get this error. Do I have to set up permissions somewhere else? HTTP Error 500.19 - Internal Server Error The requested page cannot be accessed because the related configuration data for the page is invalid. Detailed Error Information Module IIS Web Core Notification BeginRequest Handler Not yet determined Error Code 0x80070005 Config Error Cannot read configuration file due to insufficient permissions Config File \\?\C:\Users\Administrator\Documents\My Dropbox\Research Masters\Project\WCFService\Website\web.config Requested URL http://localhost:80/WCFService/Service.svc Physical Path C:\Users\Administrator\Documents\My Dropbox\Research Masters\Project\WCFService\Website\Service.svc Logon Method Not yet determined Logon User Not yet determined Config Source -1: 0: Links and More Information This error occurs when there is a problem reading the configuration file for the Web server or Web application. In some cases, the event logs may contain more information about what caused this error. 2) Visual Studio generated two web pages to run my Silverlight application (.html and .aspx). When I am running the Silverlight application (connected as admin) I can navigate to the .html page, no problem. When I try to open the .aspx file I get the following error: Server Error in '/Platform' Application. Access is denied. Description: An error occurred while accessing the resources required to serve this request. You might not have permission to view the requested resources. Error message 401.3: You do not have permission to view this directory or page using the credentials you supplied (access denied due to Access Control Lists). Ask the Web server's administrator to give you access to 'C:\Users\Administrator\Documents\My Dropbox\Research Masters\Project\Platform\Website\PlatformTestPage.aspx'. Version Information: Microsoft .NET Framework Version:4.0.30128; ASP.NET Version:4.0.30128.1 3) The WCF service runs fine (again, connected as admin) until I restart the server. When I try to run the WCF service after a reboot, the MySQL assembly seems to be missing from the solution. If I just rebuild the solution and run the service again... it works (until the next restart). What is causing this error? Solution here - http://tinypic.com/view.php?pic=5yasqx&s=5 Server Error in '/WCFService' Application. Could not load file or assembly 'MySql.Data, Version=6.2.2.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d' or one of its dependencies. Access is denied. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.IO.FileLoadException: Could not load file or assembly 'MySql.Data, Version=6.2.2.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d' or one of its dependencies. Access is denied.
Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Assembly Load Trace: The following information can be helpful to determine why the assembly 'MySql.Data, Version=6.2.2.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d' could not be loaded. WRN: Assembly binding logging is turned OFF. To enable assembly bind failure logging, set the registry value [HKLM\Software\Microsoft\Fusion!EnableLog] (DWORD) to 1. Note: There is some performance penalty associated with assembly bind failure logging. To turn this feature off, remove the registry value [HKLM\Software\Microsoft\Fusion!EnableLog]. Stack Trace: [FileLoadException: Could not load file or assembly 'MySql.Data, Version=6.2.2.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d' or one of its dependencies. Access is denied.] System.Reflection.RuntimeAssembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, RuntimeAssembly locationHint, StackCrawlMark& stackMark, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks) +0 System.Reflection.RuntimeAssembly.InternalLoadAssemblyName(AssemblyName assemblyRef, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection, Boolean suppressSecurityChecks) +567 System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection) +192 System.Reflection.Assembly.Load(String assemblyString) +35 System.ServiceModel.Activation.ServiceHostFactory.CreateServiceHost(String constructorString, Uri[] baseAddresses) +243 System.ServiceModel.HostingManager.CreateService(String normalizedVirtualPath) +1423 System.ServiceModel.HostingManager.ActivateService(String normalizedVirtualPath) +50 System.ServiceModel.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath) +1132 [ServiceActivationException: The service '/WCFService/Service.svc' cannot be activated due to an exception during compilation. The exception message is: Could not load file or assembly 'MySql.Data, Version=6.2.2.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d' or one of its dependencies. Access is denied..] System.Runtime.AsyncResult.End(IAsyncResult result) +889824 System.ServiceModel.Activation.HostedHttpRequestAsyncResult.End(IAsyncResult result) +179150 System.Web.AsyncEventExecutionStep.OnAsyncEventCompletion(IAsyncResult ar) +107 Version Information: Microsoft .NET Framework Version:4.0.30128; ASP.NET Version:4.0.30128.1 That's about it. I hope someone reads this message; I wasted most of the weekend trying to fix these problems on my own... thanks

    Read the article

  • ASP.NET, Web API and ASP.NET Web Pages Open Sourced

    - by The Official Microsoft IIS Site
    More Open Source goodness from Microsoft today, with the announcement that we are open sourcing ASP.NET MVC 4, ASP.NET Web API, ASP.NET Web Pages v2 (Razor) - all with contributions - under the Apache 2.0 license. You can find the source on CodePlex , and all the details on Scott Guthrie's blog . “We will also for the first time allow developers outside of Microsoft to submit patches and code contributions that the Microsoft development team will review for potential inclusion in the products...(read more)

    Read the article

  • SQL SERVER – SSMS: Disk Usage Report

    - by Pinal Dave
    Let us start with humor! I think with the series on various reports, we have come to a logical point. We covered all the reports at the server level. This means the reports we saw were targeted towards activities that are related to instance-level operations. These are mostly like how a doctor diagnoses a patient. At this point I am reminded of a dialog which I read somewhere: Patient: Doc, it hurts when I touch my head. Doc: Ok, go on. What else have you experienced? Patient: It hurts even when I touch my eye, it hurts when I touch my arms, it even hurts when I touch my feet, etc. Doc: Hmmm … Patient: I feel it hurts when I touch anywhere in my body. Doc: Ahh … now I get it. You need a plaster on your finger, John. Sometimes the server level gives an indication of what is happening in the system, but we need to get to the root cause for a specific database. So, this is the first blog in the series where we start discussing database-level reports. To launch database-level reports, expand the selected server in Object Explorer, expand the Databases folder, and then right-click any database for which we want to look at reports. From the menu, select Reports, then Standard Reports, and then any of the database-level reports. In this blog, we will talk about four “disk” reports because they are similar: Disk Usage, Disk Usage by Top Tables, Disk Usage by Table, and Disk Usage by Partition. Disk Usage: This report shows several pieces of information about the database. Let us discuss them one by one. We have divided the output into 5 different sections. Section 1 shows the high-level summary of the database. It shows the space used by database files (mdf and ldf). Under the hood, the report uses various DMVs and DBCC commands: sys.data_spaces and DBCC SHOWFILESTATS. Sections 2 and 3 are pie charts: one for data file allocation and another for the transaction log file. The pie chart for “Data Files Space Usage (%)” shows the space consumed by data and indexes allocated to the SQL Server database, and unallocated space which is allocated to the SQL Server database but not yet filled with anything. “Transaction Log Space Usage (%)” uses DBCC SQLPERF (LOGSPACE) and shows how much empty space we have in the physical transaction log file. Section 4 shows the data from the Default Trace and looks at Event IDs 92, 93, 94 and 95, which are for “Data File Auto Grow”, “Log File Auto Grow”, “Data File Auto Shrink” and “Log File Auto Shrink” respectively. Here is an expanded view for that section. If the default trace is not enabled, then this section is replaced by the message “Trace Log is disabled”, as highlighted below. Section 5 of the report uses DBCC SHOWFILESTATS to get information. Here is the enhanced version of that section. This shows the physical layout of the file. In case you have In-Memory objects in the database (from SQL Server 2014), the report would show information about those as well. Here is the screenshot taken for a different database which has an In-Memory table. I have highlighted the new things which are only shown for an in-memory database. The new sections highlighted above use sys.dm_db_xtp_checkpoint_files, sys.database_files and sys.data_spaces. The new type for In-Memory OLTP is ‘FX’ in sys.data_spaces. The next set of reports is targeted at getting information about a table and its storage. These reports can answer questions like: Which is the biggest table in the database? How many rows do we have in a table? Is there any table which has a lot of reserved space that is unused?
Which partition of the table holds the most data? Disk Usage by Top Tables: This report provides detailed data on the utilization of disk space by the top 1000 tables within the database. The report does not provide data for memory-optimized tables. Disk Usage by Table: This report is the same as the previous one, with a few differences: the first report shows only 1000 rows, and the first report orders by values in the DMV sys.dm_db_partition_stats whereas the second one orders by the name of the table. Both of the reports have an interactive sort facility: we can click on any column header and change the sorting order of the data. Disk Usage by Partition: This report shows the distribution of the data in a table based on the partitions in the table. This is very similar to the previous output, now with the partition details. Here is the query taken from Profiler: SELECT row_number() OVER (ORDER BY a1.used_page_count DESC, a1.index_id) AS row_number, (dense_rank() OVER (ORDER BY a5.name, a2.name))%2 AS l1, a1.OBJECT_ID, a5.name AS [schema], a2.name, a1.index_id, a3.name AS index_name, a3.type_desc, a1.partition_number, a1.used_page_count * 8 AS total_used_pages, a1.reserved_page_count * 8 AS total_reserved_pages, a1.row_count FROM sys.dm_db_partition_stats a1 INNER JOIN sys.all_objects a2 ON (a1.OBJECT_ID = a2.OBJECT_ID) AND a1.OBJECT_ID NOT IN (SELECT OBJECT_ID FROM sys.tables WHERE is_memory_optimized = 1) INNER JOIN sys.schemas a5 ON (a5.schema_id = a2.schema_id) LEFT OUTER JOIN sys.indexes a3 ON ((a1.OBJECT_ID = a3.OBJECT_ID) AND (a1.index_id = a3.index_id)) WHERE (SELECT MAX(DISTINCT partition_number) FROM sys.dm_db_partition_stats a4 WHERE (a4.OBJECT_ID = a1.OBJECT_ID)) >= 1 AND a2.TYPE <> N'S' AND a2.TYPE <> N'IT' ORDER BY a5.name ASC, a2.name ASC, a1.index_id, a1.used_page_count DESC, a1.partition_number Using all of the above reports, you should be able to get the usage of database files and also the space used by tables. I think this is too much disk information for a single blog, and I hope you have used these reports in the past to get data. Do let me know if you found anything interesting using these reports in your environments. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL Tagged: SQL Reports
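    If you want the raw numbers behind the Disk Usage pie charts without running the report, a quick sketch (not from the post) against the current database is:

      -- Data and log file sizes and free space; size and SpaceUsed are in 8 KB pages.
      SELECT  name,
              type_desc,
              size * 8 / 1024                                     AS size_mb,
              FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024          AS used_mb,
              (size - FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024 AS free_mb
      FROM    sys.database_files;

      -- Log usage per database, the same source the "Transaction Log Space Usage (%)" chart uses.
      DBCC SQLPERF (LOGSPACE);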

    Read the article

  • New way of creating web applications on Visual Studio 2013

    - by DigiMortal
    Yesterday Visual Studio 2013 Preview was released, and now it's time to play with it. The first thing I noticed was the new way to create web applications. For all web applications there is a generic dialog where you can set all the important options for your new web application before it is created. Let's see how it works. We'll also take a look at the new blue theme of Visual Studio 2013. Read more from my new blog @ gunnarpeipman.com

    Read the article

  • Silverlight Cream for May 05, 2010 -- #856

    - by Dave Campbell
    In this Issue: Jeremy Alles(-2-), Kunal Chowdhury, anand iyer, Yochay Kiriaty(-2-, -3-), Max Paulousky, David Kelley, smartyP, Tim Heuer, and Dan Wahlin. Shoutout: Tim Heuer provides links for all the Ways to give feedback on Silverlight From SilverlightCream.com: [WP7] Bug when using NavigationService in Windows Phone 7 Jeremy Alles has blogged about a bug he found using the Navigation service in WP7. He gives the steps to reproduce and a couple possible workarounds. [WP7] Using the camera in the emulator Jeremy Alles is also digging into the camera functionality in the emulator. He has code demonstrating launching a camera task, and a list of other tasks available. Silverlight Tutorials Chapter 3: Introduction to Panels Kunal Chowdhury has Chapter 3 of his Silverlight 4 Tutorial series up and he's talking about Panels this time out. Push Notifications in Windows Phone 7 developer tools CTP April Refresh anand iyer is discussing the Push Notifications, only from a code perspective. Good information and good additional links to follow. Windows Phone Application Life Cycle Yochay Kiriaty talks with Tudor Toma and Jaime Rodriguez about the WP7 application lifecycle on Channel 9. Understanding Microsoft Push Notifications for Windows Phones Yochay Kiriaty has a 2-part post up on WP7 Push Notifications. The first part is explaining what Push Notifications are and why we need them... as a developer and as an end user viewing Toast or Tile notifications. Understanding How Microsoft Push Notification Works – Part 2 In the 2nd part of his Push Notification series, Yochay Kiriaty discusses how the Push Notification works under the covers. To Remember: Deployment of Silverlight Applications With Wcf Ria Services Max Paulousky has a post up for reference on what to look into when you get "Load Operation Failed" in WCF RIA services. Launching a URL from an OOB Silverlight Application David Kelley has a quick post up on launching URLs from an OOB app. If you haven't tried it, you may be surprised as he was at first. Creating a Windows Phone 7 XNA Game in Landscape Orientation smartyP is looking at recreating a landscape WP7 game in XNA and is detailing some of the issues he's been dealing with, and is also sharing a project file. New Silverlight 4 Themes available–get the raw bits Tim Heuer provided 'raw' versions of 3 new themes. Read his post to see exactly what he means by 'raw' ... they're definitely good looking, and are going to get a lot of play. Handling WCF Service Paths in Silverlight 4 – Relative Path Support Dan Wahlin shares his technique for avoiding the pain involved with ServiceReferences.ClientConfig by using Silverlight 4 relative path support. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Auto Mocking using JustMock

    - by mehfuzh
    Auto mocking containers are designed to reduce the friction of keeping unit test beds in sync with the code being tested as systems are updated and evolve over time. That is the more or less formal, one-sentence definition of auto mocking. Put informally, auto mocking containers are nothing but a tool to keep your tests synced so that you don't have to go back and change tests every time you add a new dependency to your SUT, or System Under Test. As of Q3 2012, JustMock ships with a built-in auto mocking container. This lets developers keep all the fun they are already having with JustMock, plus they can now mock objects with dependencies in a more elegant way and without the homework of managing the dependency graph. If you are not familiar with auto mocking, I won't go ahead and educate you here; rather, I'll ask you to read up on it from the content that is already available from the community, as that is beyond the scope of this post. Moving forward, getting started with JustMock auto mocking is pretty simple. First, I have to reference Telerik.JustMock.Container.DLL from the installation folder along with Telerik.JustMock.DLL (of course), which it uses internally, and next I will write my tests with the mocking container. It's that simple! In this post I will first mock the target with dependencies using the current method, and then do the same with the auto mocking container. In short, the sample is all about a report builder that will go through all the existing reports, send email, and log any exception in that process. This is roughly what my report builder class looks like: The Reporter class depends on the following interfaces: IReporBuilder: used to create and get the available reports; IReportSender: used to send the reports; ILogger: used to log any exception. Now, if I just write the test without using an auto mocking container, it might end up something like this: It looks fine. However, the only issue is that I am creating the mock for each dependency, which is sort of grunt work, and if you have an ever-changing list of dependencies it becomes really hard to keep the tests in sync. The typical example is your ASP.NET MVC controller, where the number of service dependencies grows along with the project. The same test, if written with the auto mocking container, would look like: Here are a few things to observe: I didn't create a mock for each dependency; there is no extra step creating the Reporter class and passing in the dependencies; and since ILogger is not required for the purpose of this test, I can be completely ignorant of it. How cool is that? Auto mocking in JustMock has just been released, and we also want to extend it even further using the profiler so that it can resolve not just interfaces but concrete classes as well. But that, of course, starts the debate of code smell vs. working with legacy code. Feel free to send in your expert opinion in that regard using one of Telerik's official channels. Hope that helps

    Read the article
