Search Results

Search found 18580 results on 744 pages for 'visual studio extensions'.

Page 158/744 | < Previous Page | 154 155 156 157 158 159 160 161 162 163 164 165  | Next Page >

  • Setting XSL-FO XML Schema in Visual Studio

    - by Lukasz Kurylo
    I've been playing lately with XSL-FO for generating PDF documents. XSL-FO has a long list of available tags and attributes, and for someone new who wants to create a simple document, finding the proper one is a nightmare. Fortunately we can set a schema for XSL-FO, which gives us full IntelliSense in VS. For a simple *.fo file, we can set the path to the schema directly in the file: <?xml version="1.0" encoding="utf-8"?> <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.w3.org/1999/XSL/Format http://www.xmlblueprint.com/documents/fop.xsd"> ... We can of course use the built-in VS XML Schemas selector instead. To use it, we must copy the schema file to the Schemas catalog (the default path for VS2012 is C:\Program Files (x86)\Microsoft Visual Studio 11.0\Xml\Schemas). Then we can go to the Properties of the opened xml/xslt file and assign the newly added schema to the file. From now on, we should have IntelliSense enabled as shown below.

    Read the article

  • Speed up loading of test results from builds in Visual Studio

    - by Jakob Ehn
    I still see people complaining about the long time it takes to load test results from a TFS build in Visual Studio. And they make a valid point: it does take a very long time to load the test results, even for a small number of tests. The reason for this is that the test results are not just the result of the test run but also all the binaries that were part of the test run. This often also means that the debug symbols (*.pdb) will be downloaded to your local machine. The reason for this behaviour is that it lets you re-run the tests locally. However, most of the time this is not what the developer will do; they just want to know which tests failed and why. They can then fix the tests and rerun them locally. It turns out there is a way to load only the test results, which is much faster. The only tricky bit is to find the location of the .trx file that is generated during the build, particularly in TFS 2010 where you often have multiple build agents, which of course results in different paths to the trx file. Note: To use this you must have read permission to the build folder on the build agent where the build was executed. Open the build result for the build. Click View Log. Locate the part where MSTest is invoked; when using test containers, it looks like this:   Note: You can actually search in the log window; press Ctrl+F and you will get a little search box at the bottom. Nice! On the MSTest command line call, locate the /resultsfileroot parameter, which points to the folder where the test results are stored. Note that this path is local to the build server, so you need to replace the drive letter with the server name, i.e. D:\Builds\Project\TestResults becomes \\<BuildServer>\Project\TestResults. Double-click on the .trx file and you will notice that it loads much faster compared to opening it from the build log window.
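    The path translation in the last step is simple enough to script. The helper below is illustrative only (not from the article) and assumes the agent's drive is reachable through the server's administrative share; adjust it if your build folder is exposed through a named share instead.

    ```csharp
    // Illustrative helper (not from the article): turn the agent-local results
    // folder found in the build log into a UNC path reachable from a developer
    // machine. Assumption: the drive is exposed via the administrative share.
    static class BuildPaths
    {
        public static string ToUncPath(string localPath, string buildServer)
        {
            // e.g. @"D:\Builds\Project\TestResults", "BUILDSRV01"
            //   -> @"\\BUILDSRV01\D$\Builds\Project\TestResults"
            string root  = System.IO.Path.GetPathRoot(localPath);   // "D:\"
            string drive = root.TrimEnd('\\').TrimEnd(':');         // "D"
            string rest  = localPath.Substring(root.Length);        // "Builds\Project\TestResults"
            return string.Format(@"\\{0}\{1}$\{2}", buildServer, drive, rest);
        }
    }
    ```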

    Read the article

  • APress Deal of the Day - 1/June/2012 - Introducing Visual C# 2010

    - by TATWORTH
    Today's $10 Deal of the Day from APress at http://www.apress.com/9781430231714 is Introducing Visual C# 2010. "If you're new to C# programming, this book is the ideal way to get started. Respected author Adam Freeman guides you through the C# language by carefully building up your knowledge from fundamental concepts to advanced features." Adam Freeman is an excellent author. This is an excellent introduction to C# programming and a manual for those with experience. Having read through the book, I am very impressed by its practical approach to C#. I cannot improve on the by-line "Get started on your C# journey with an expert by your side leading by example". Adam Freeman teaches C# by precept and example. I suspect he drives a Volvo C30, as it comes up in many of the code examples! Throughout the book there are numerous links back and forth so as to avoid over-complicating the current topic. I have no hesitation in recommending this book both to programmers starting out with C# and to the seasoned professional. It is a book that should be on every C# development team's bookshelf.

    Read the article

  • Sample Browser Visual Studio Extension is localized and introduced to Japan

    - by Jialiang
    http://blogs.msdn.com/b/codefx/archive/2012/10/14/sample-browser-visual-studio-extension-is-localized-and-introduced-to-japan.aspx  From: Japan MVP: "The Sample Browser is very easy to use thanks to the refined interface. The categorized menu enables faster search. Highly acclaimed. But it needs localization. It may not be a problem for those who can understand English, but I think localizing Sample Browser into Japanese will promote its use in Japan further." This is a prominent piece of feedback collected from the Japan MVP community since we released the last version of Sample Browser, which was only available in English. Japanese developers like the Sample Browser, but they want localized code samples, a localized Sample Browser UI, and a localized search experience. The Japan MVP lead, Satoru Kitabata, observed these needs and expectations. He started to engage with all local developer MVPs to translate the UI elements in the Sample Browser. Lots of MVPs signed up to participate in this work. They had roundtables and newsletters to track the progress. In just three weeks, every control, every tooltip, every font on every label, was beautifully tuned for Japanese. The sample search experience was also optimized for Japanese developers - they can directly type a Japanese query to search for code samples. Together with the Microsoft Japan MVPs, the sample-browsing experience has been localized and improved to a new level! The Japan MVP lead further worked with the MSDN Japan site manager and Japan DPE to introduce the good news of the localized Sample Browser to Japan: Sample Browser http://msdn.microsoft.com/ja-jp/jj730399 and http://msdn.microsoft.com/ja-jp/jj730398. Thanks to the joint effort and the Japan MVPs' feedback and contributions, the Sample Browser gets the chance to benefit the broader Japanese developer audience.

    Read the article

  • NHibernate Tools: Visual NHibernate

    - by Ricardo Peres
    You probably know that I'm a big fan of Slyce Software's Visual NHibernate. To me, it is the best tool for generating your entities and mappings from an existing database (it also allows you to go the other way, but I honestly have never used it that way). What I like most about it: Great support: the folks at Slyce always listen to your suggestions and give you feedback in a timely manner, and I was even lucky enough to have some of my suggestions implemented! The templating engine, which is very powerful and more user-friendly than, for example, MyGeneration's; one of the included templates is Sharp Architecture. Advanced model validations: it even warns you about having lazy properties declared in non-lazy entities. Integration with NHibernate Validator and generation of validation rules automatically based on the database, or on user-defined model settings. The designer: they opted for not displaying all entities in a single screen, which I think was a good decision; it has support for all inheritance strategies (table per class hierarchy, table per class, table per concrete class). Generation of FluentNHibernate mappings as well as hbm.xml. I could name others, but… why don't you see for yourself? There is a demo version available for download. By the way, I am in no way related to Slyce, I just happen to like their software!
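    For readers who haven't seen FluentNHibernate mappings before, here is a rough idea of the kind of class map such a tool emits. This sketch is not output from Visual NHibernate; the entity, table and column names are assumptions chosen for illustration.

    ```csharp
    // Hand-written sketch of a FluentNHibernate ClassMap (not generated by
    // Visual NHibernate); entity/table/column names are assumptions.
    using System.Collections.Generic;
    using FluentNHibernate.Mapping;

    public class Customer
    {
        public virtual int Id { get; protected set; }
        public virtual string Name { get; set; }
        public virtual IList<Order> Orders { get; set; }
    }

    public class Order
    {
        public virtual int Id { get; protected set; }
        public virtual Customer Customer { get; set; }
    }

    public class CustomerMap : ClassMap<Customer>
    {
        public CustomerMap()
        {
            Table("Customer");
            Id(x => x.Id).GeneratedBy.Identity();
            Map(x => x.Name).Not.Nullable().Length(100);
            // One-to-many to Order; lazy loading is NHibernate's default.
            HasMany(x => x.Orders).Inverse().Cascade.AllDeleteOrphan();
        }
    }
    ```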

    Read the article

  • Visual WebGui launches a new prize-winning challenge for developers

    - by Webgui
    Gizmox is announcing a ListView Challenge in which developers can participate by creating and submitting their own implementations of the new extended ListView. "It's quite amazing what you can do with it. It opens a lot of new ways to present data in a better and more user-friendly way," says one of the VWG community members, who built a three-level hierarchical ListView. Watch the hierarchical ListView demo by Visualizer. These ListView implementations will be reviewed and rated, and the winner will win a free Professional Studio license worth $750. The 5 top-rated entries will entitle their developers to a cool new T-shirt. The new v6.4 introduces new capabilities with its extended ListView control. Enter the Challenge. The Collapsible Panel enhancement of the ListView control, along with the Column Type Control, opens up the possibilities for potential usage of the ListView control for data display and data entry, and since the Collapsible Panel can contain whatever control you like, it can also contain other ListView controls, making it possible to create a hierarchical ListView display with an unlimited number of levels. The first enhancement is the introduction of a new column type, Control, which opens up the possibility for a ListView cell to contain controls like CheckBox, ComboBox, ListBox or even TabControl, Form or another ListView as the contents of that particular cell. This means that the ListView is no longer a display-only control, but has the full potential of being a full-blown data entry control as well. The second major enhancement is the introduction of the ListViewPanelItem. The ListViewPanelItem behaves exactly the same as its predecessor, the ListViewItem, and in addition it has a Panel control attached to it, a separate panel for each row in the ListView. This new Panel can be either expanded (visible) or not (hidden), and when expanded it will fill the full width of the ListView, but has adjustable height. Watch a webcast about the extended ListView.

    Read the article

  • Cannot get temperatures in Dell Studio 1558

    - by Athul Iddya
    I could never get proper temperatures on my Dell Studio 1558. lm-sensors and acpi give wrong readings. The output of sensors is, $ sensors acpitz-virtual-0 Adapter: Virtual device temp1: +26.8°C (crit = +100.0°C) temp2: +0.0°C (crit = +100.0°C) acpi -V gives me, $ acpi -V Battery 0: Full, 100% Battery 0: design capacity 414 mAh, last full capacity 369 mAh = 89% Adapter 0: on-line Thermal 0: ok, 0.0 degrees C Thermal 0: trip point 0 switches to mode critical at temperature 100.0 degrees C Thermal 0: trip point 1 switches to mode passive at temperature 95.0 degrees C Thermal 0: trip point 2 switches to mode active at temperature 71.0 degrees C Thermal 0: trip point 3 switches to mode active at temperature 55.0 degrees C Thermal 1: ok, 26.8 degrees C Thermal 1: trip point 0 switches to mode critical at temperature 100.0 degrees C Thermal 1: trip point 1 switches to mode active at temperature 71.0 degrees C Thermal 1: trip point 2 switches to mode active at temperature 55.0 degrees C Cooling 0: LCD 0 of 15 Cooling 1: Processor 0 of 10 Cooling 2: Processor 0 of 10 Cooling 3: Processor 0 of 10 Cooling 4: Processor 0 of 10 Cooling 5: Fan 0 of 1 Cooling 6: Fan 0 of 1 I suspect even hddtemp gives bogus readings as its always at 46 $ sudo hddtemp /dev/sda /dev/sda: ST9500420AS: 46°C I have gone through some bug reports and some used to have the same problem after resuming from suspend. But I always have this problem. I had updated to the latest BIOS from Windows a couple of weeks ago, will updating from Ubuntu change anything? CORRECTION: hddtemp's readings do change. Its now at 45.

    Read the article

  • Visual Studio 2008 doesn't create *.refresh files for external DLL references... what am I missing?

    - by Cory Larson
    Hi all-- I've got a question about something that's just been irritating me. A colleague and I are building a support framework for our current client that we want to reference in other projects. The DLL we want as a reference in our project would be an external reference. We're adding it by doing "Add Reference...", then browsing to the location of the .dll. What I want Visual Studio to do is only add the .xml, .pdb, and a .dll.refresh file, but instead it copies the actual .dll (and .xml and .pdb) into the bin. When we rebuild the framework project, the other project that uses its .dll gets all out of whack until we drop and re-add the reference. Everything I've read online says that VS2008 is supposed to create the .dll.refresh files for you, but it never does. Any ideas? Am I missing something or doing something wrong? At this point I'm ready to add a pre-build event to simply copy the framework .dll into my bin, but the .refresh file seems like less of a hassle if it would just work. Thanks, Cory

    Read the article

  • How to avoid visual artifacts when hosting WPF user controls within a WinForms MDI app?

    - by jpierson
    When hosting WPF user controls within a WinForms MDI app there is a drawing issue when you have multiple forms that overlap each other, which causes very distinct visual artifacts. These artifacts are mostly visible after dragging one child form over another one that also hosts WPF content, or by allowing the edges of the child form to be clipped by the main MDI parent when dragging it around. After the drag and drop of the child form is completed the artifacts generally stay around, but I've found that setting focus to a different application's window and then refocusing back on my application window causes it to be redrawn, and all is good again until the child forms are moved once more. Please see the image below which demonstrates the problem. Since many at Microsoft insist that WinForms MDI is already a complete solution for MDI, even when dealing with WPF, I find it hard to believe any of them have ever actually tried creating a mostly-WPF app that utilizes WinForms MDI; otherwise it would be hard to recommend while keeping a straight face. I'm hoping to either come up with proof that this solution truly is not acceptable or possibly find a way to overcome this and a few other specific issues.
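    For context, the hosting arrangement being described usually looks something like the sketch below: a WPF element placed on a WinForms MDI child through an ElementHost. The form name is an assumption; the sketch only illustrates the setup, not a fix for the redraw artifacts.

    ```csharp
    // Minimal sketch of the setup under discussion: a WPF control hosted on a
    // WinForms MDI child via ElementHost (a real WPF UserControl would normally
    // be assigned to Child; a WPF TextBox is used here for brevity).
    using System.Windows.Forms;
    using System.Windows.Forms.Integration;

    public class WpfChildForm : Form
    {
        public WpfChildForm()
        {
            Text = "WPF-hosting MDI child";
            var host = new ElementHost { Dock = DockStyle.Fill };
            host.Child = new System.Windows.Controls.TextBox { Text = "WPF content" };
            Controls.Add(host);
        }
    }

    // Typical usage from the MDI parent form:
    //   var child = new WpfChildForm { MdiParent = this };
    //   child.Show();
    ```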

    Read the article

  • Unable to run Ajax Minifier as post-build in Visual Studio.

    - by James South
    I've set up my post build config as demonstrated at http://www.asp.net/ajaxlibrary/ajaxminquickstart.ashx I'm getting the following error though: The "JsSourceFiles" parameter is not supported by the "AjaxMin" task. Verify the parameter exists on the task, and it is a settable public instance property. My configuration settings...... <Import Project="$(MSBuildExtensionsPath)\Microsoft\MicrosoftAjax\ajaxmin.tasks" /> <Target Name="AfterBuild"> <ItemGroup> <JS Include="**\*.js" Exclude="**\*.min.js" /> </ItemGroup> <ItemGroup> <CSS Include="**\*.css" Exclude="**\*.min.css" /> </ItemGroup> <AjaxMin JsSourceFiles="@(JS)" JsSourceExtensionPattern="\.js$" JsTargetExtension=".min.js" CssSourceFiles="@(CSS)" CssSourceExtensionPattern="\.css$" CssTargetExtension=".min.css" /> </Target> I had a look at the AjaxMinTask.dll with reflector and noted that the publicly exposed properties do not match the ones in my config. There is an array of ITaskItem called SourceFiles though so I edited my configuration to match. <Import Project="$(MSBuildExtensionsPath)\Microsoft\MicrosoftAjax\ajaxmin.tasks" /> <Target Name="AfterBuild"> <ItemGroup> <JS Include="**\*.js" Exclude="**\*.min.js" /> </ItemGroup> <ItemGroup> <CSS Include="**\*.css" Exclude="**\*.min.css" /> </ItemGroup> <AjaxMin SourceFiles="@(JS);@(CSS)" SourceExtensionPattern="\.js$;\.css$" TargetExtension=".min.js;.min.css"/> </Target> I now get the error: The "SourceFiles" parameter is not supported by the "AjaxMin" task. Verify the parameter exists on the task, and it is a settable public instance property. I'm scratching my head now. Surely it should be easier than this? I'm running Visual Studio 2010 Ultimate on a Windows 7 64 bit installation.

    Read the article

  • "Illegal characters in path." Visual Studio WinForm Design View

    - by jacksonakj
    I am putting together a lightweight MVP pattern for a WinForms project. Everything compiles and runs fine. However when I attempt to open the WinForm in design mode in Visual Studio I get a "Illegal characters in path" error. My WinForm is using generics and inheriting from a base Form class. Is there a problem with using generics in a WinForm? Here is the WinForm and base Form class. public partial class TapsForm : MvpForm<TapsPresenter, TapsFormModel>, ITapsView { public TapsForm() { InitializeComponent(); } public TapsForm(TapsPresenter presenter) :base(presenter) { InitializeComponent(); UpdateModel(); } public IList<Taps> Taps { set { gridTaps.DataSource = value; } } private void UpdateModel() { Model.RideId = Int32.Parse(cboRide.Text); Model.Latitude = Double.Parse(txtLatitude.Text); Model.Longitude = Double.Parse(txtLongitude.Text); } } Base form MvpForm: public class MvpForm<TPresenter, TModel> : Form, IView where TPresenter : class, IPresenter where TModel : class, new() { private readonly TPresenter presenter; private TModel model; public MvpForm() { } public MvpForm(TPresenter presenter) { this.presenter = presenter; this.presenter.RegisterView(this); } protected override void OnLoad(EventArgs e) { base.OnLoad(e); if (presenter != null) presenter.IntializeView(); } public TModel Model { get { if (model == null) throw new InvalidOperationException("The Model property is currently null, however it should have been automatically initialized by the presenter. This most likely indicates that no presenter was bound to the control. Check your presenter bindings."); return model; } set { model = value;} } }
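    The Visual Studio designer is known to struggle with forms whose base class is generic. A workaround that is often suggested (my addition, not part of the question, and it may or may not cure this particular "Illegal characters in path" error) is to put a concrete, non-generic class between the form and the generic base so the designer only ever sees a closed type:

    ```csharp
    // Workaround sketch (hypothetical): close the generic MvpForm in a
    // non-generic intermediate class so the WinForms designer never has to
    // instantiate a generic base type directly.
    public class TapsFormBase : MvpForm<TapsPresenter, TapsFormModel>
    {
        public TapsFormBase() { }                                     // used by the designer
        public TapsFormBase(TapsPresenter presenter) : base(presenter) { }
    }

    public partial class TapsForm : TapsFormBase, ITapsView
    {
        public TapsForm() { InitializeComponent(); }
        public TapsForm(TapsPresenter presenter) : base(presenter) { InitializeComponent(); }
        // ... rest of the form unchanged ...
    }
    ```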

    Read the article

  • How to compile a C DLL for 64 bit with Visual Studio 2010?

    - by Daren Thomas
    I have a DLL written in C in source code. This is the code for the General Polygon Clipper (in case you are interested). I'm using it in a C# project via the C# wrapper provided on the homepage. This comes with a precompiled DLL. Since switching to a 64-bit development machine with Visual Studio 2010 and Windows 7 64-bit, the application won't run anymore. This is the error I get: An attempt was made to load a program with an incorrect format. This is because I'm DllImporting the 32-bit gpc.dll, as I have gathered from what I found on the web. I assume this will all go away if I recompile the DLL to 64-bit, but I can't for the love of me figure out how to do so. My C skills are basic, in that I can write a C program with the GNU tools, but I have no experience with various compilers / processors / IDEs etc. I believe I could port this to C#. By that I mean I trust myself to actually pull it off. But I'd prefer not to, since it is a lot of work that I'd prefer a compiler to do for me ;)
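    The "incorrect format" error is the classic 32/64-bit mismatch: a 64-bit process cannot load a 32-bit native DLL. A small diagnostic guard like the one below (my addition, assuming no 64-bit build of gpc.dll is present) makes the failure mode explicit; the actual fix is either building gpc.dll for x64 or setting the managed project's Platform Target to x86.

    ```csharp
    // Diagnostic sketch (my addition): fail fast with a clear message before the
    // first P/Invoke into the 32-bit gpc.dll is attempted from a 64-bit process.
    using System;

    static class GpcGuard
    {
        public static void EnsureExpectedBitness()
        {
            // Assumption: only the 32-bit precompiled gpc.dll is available.
            if (Environment.Is64BitProcess)
                throw new InvalidOperationException(
                    "This process is running as 64-bit, but gpc.dll is 32-bit. " +
                    "Rebuild gpc.dll for x64 or set the application's Platform Target to x86.");
        }
    }
    ```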

    Read the article

  • How do I find and open a file in a Visual Studio 2005 add-in?

    - by Charles Randall
    I'm making an add-in for Visual Studio 2005 in C# to help easily toggle between source and header files, as well as script files that all follow a similar naming structure. However, the directory structure has all of the files in different places, even though they are all in the same project. I've got almost all the pieces in place, but I can't figure out how to find and open a file in the solution based on the file name alone. So I know I'm coming from, say, c:\code\project\subproject\src\blah.cpp, and I want to open c:\code\project\subproject\inc\blah.h, but I don't necessarily know where blah.h is. I could hardcode different directory paths, but then the utility isn't generic enough to be robust. The solution has multiple projects, which seems to be a bit of a pain as well. I'm thinking at this point that I'll have to iterate through every project, and iterate through every project item, to see if the particular file is there, and then get a proper reference to it. But it seems to me there must be an easier way of doing this.
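    If the recursive walk turns out to be the way to go, a rough sketch of it over the EnvDTE automation model is shown below. It assumes the add-in already holds the DTE reference from OnConnection and that matching on the file name alone is acceptable; solution folders and multi-file items are only handled to the extent the simple recursion covers them.

    ```csharp
    // Rough sketch (assumptions: DTE obtained in the add-in's OnConnection;
    // matching purely on file name is good enough for this solution).
    using EnvDTE;

    public static class SolutionFileFinder
    {
        public static void OpenByFileName(DTE dte, string fileName)
        {
            foreach (Project project in dte.Solution.Projects)
            {
                string path = Find(project.ProjectItems, fileName);
                if (path != null)
                {
                    dte.ItemOperations.OpenFile(path, Constants.vsViewKindCode);
                    return;
                }
            }
        }

        private static string Find(ProjectItems items, string fileName)
        {
            if (items == null) return null;
            foreach (ProjectItem item in items)
            {
                // FileNames is a 1-based parameterized property.
                if (item.FileCount > 0)
                {
                    string candidate = item.get_FileNames(1);
                    if (string.Equals(System.IO.Path.GetFileName(candidate), fileName,
                                      System.StringComparison.OrdinalIgnoreCase))
                        return candidate;
                }
                // Recurse into folders, filters and nested items.
                string nested = Find(item.ProjectItems, fileName);
                if (nested != null) return nested;
            }
            return null;
        }
    }
    ```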

    Read the article

  • What to use for version control with Visual Studio 2008 for inhouse projects?

    - by Boog
    We want to put a number of our in-house projects under version control. Our projects are C# .NET applications and assemblies. We originally decided to go Microsoft all the way (as is the norm around here), and tried installing Visual Studio Team Foundation Server. To say the least, it was way more trouble trying to get a successful install than it's worth, and I'm afraid we don't have an entire server to dedicate to TFS itself, as the installer seems to insist. We also considered a generic solution like Subversion or CVS, or possibly some kind of free online hosting that doesn't make our source publicly available under some license, like Google Code appears to. Does anyone have any suggestions for us that would best fit our in-house MS environment? We'd also get some benefit out of some kind of project management tools if they were nicely integrated into the solution, but this would only be a perk. I should also mention that we haven't entirely ruled out TFS, but it's looking like a pain, so anything you guys have to say for or against it would be helpful.

    Read the article

  • Visual Studio 2008 Unit test does not pick up code changes unless I build the entire solution

    - by Orion Edwards
    Here's the scenario: Change my code. Change my unit test for that code. With the cursor inside the unit test class/method, invoke VS2008's "Run tests in current context" command. The Visual Studio "Output" window indicates that the code dll and the test dll both successfully build (in that order). The problem, however, is that the unit test does not use the latest version of the dll which it has just built. Instead, it uses the previously built dll (which doesn't have the updated code in it), so the test fails. When adding a new method, this results in a MethodNotImplementedException, and when adding a class, it results in a TypeLoadException, both because the unit test thinks the new code is there, and it isn't! If I'm just updating an existing method, then the test just fails due to incorrect results. I can 'work around' the problem by doing this: Change my code. Change my unit test for that code. Invoke VS2008's 'Build Solution' command. With the cursor inside the unit test class/method, invoke VS2008's "Run tests in current context" command. The problem is that doing a full Build Solution (even though nothing has changed) takes upwards of 30 seconds, as I have approx 50 C# projects, and VS2008 is not smart enough to realize that only 2 of them need to be looked at. Having to wait 30 seconds just to change 1 line of code and re-run a unit test is abysmal. Is there anything I can do to fix this? None of my code is in the GAC or anything funny like that, it's just ordinary old DLLs (building against .NET 3.5 SP1 on a Win7/64-bit machine). Please help!

    Read the article

  • How good an idea is it to use code contracts in Visual Studio 2010 Professional (i.e. no static checking)?

    - by Lasse V. Karlsen
    I create class libraries, some of which are used by others around the world, and now that I'm starting to use Visual Studio 2010 I'm wondering how good an idea it is for me to switch to using code contracts instead of regular old-style if-statements. I.e. instead of this: if (String.IsNullOrWhiteSpace(fileName)) throw new ArgumentNullException("fileName"); (yes, I know, if it is whitespace, it isn't strictly null) use this: Contract.Requires(!String.IsNullOrWhiteSpace(fileName)); The reason I'm asking is that I know that the static checker is not available to me, so I'm a bit nervous about some assumptions that I make that the compiler cannot verify. This might lead to the class library not compiling for someone that downloads it, when they have the static checker. This, coupled with the fact that I cannot even reproduce the problem, would make it tiresome to fix, and I would gather that it doesn't speak volumes for the quality of my class library if it seemingly doesn't even compile out of the box. So I have a few questions: Is the static checker on by default if you have access to it? Or is there a setting I need to switch on in the class library (and since I don't have the static checker, I won't)? Are my fears unwarranted? Is the above scenario a real problem? Any advice would be welcome.
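    One pattern worth knowing about in this situation (my addition, not something the question proposes) is the "legacy requires" form: keep throwing the familiar argument exceptions and mark the block with Contract.EndContractBlock(), so the contract tools still treat the checks as preconditions while callers without any contract tooling see exactly the old behaviour.

    ```csharp
    // Sketch of the "legacy requires" form (my addition): plain if-then-throw
    // checks, terminated by Contract.EndContractBlock() so the contract tools
    // recognise them as preconditions.
    using System;
    using System.Diagnostics.Contracts;

    public static class FileLoader
    {
        public static void Load(string fileName)
        {
            if (String.IsNullOrWhiteSpace(fileName))
                throw new ArgumentNullException("fileName");
            Contract.EndContractBlock();   // everything above is the contract section

            // ... actual work ...
        }
    }
    ```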

    Read the article

  • How to save enum settings in Visual Studio project properties?

    - by zaidwaqi
    Hi, in the Settings tab in Visual Studio, I can see a Name, Type, Scope, Value table. Defining settings is intuitive if the data type is already within the Type drop-down list, i.e. integer, string, long etc. But I can't find enum anywhere. How do I save enum settings then? For now, I have the following, which clutters my code too much. public enum Action { LOCK = 9, FORCED_LOGOFF = 12, SHUTDOWN = 14, REBOOT, LOGOFF = FORCED_LOGOFF }; and I define Action as int in the setting. Then I have to do, switch (Properties.Settings.Default.Action) { case 9: SetAction(Action.LOCK); break; case 12: SetAction(Action.FORCED_LOGOFF); break; case 14: SetAction(Action.SHUTDOWN); break; case 15: SetAction(Action.REBOOT); break; default: SetAction(Action.LOCK); break; } It would be nice if I could simply do something like SetAction(Properties.Settings.Default.Action); to replace all of the above, but I don't know how to save an enum in a setting. Hope my question is clear. Thanks.
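    As long as the setting stays an int, the switch can usually be replaced by a cast plus a defined-value check. The sketch below is my suggestion rather than part of the question; it assumes falling back to LOCK (as the original default branch does) is acceptable. The Settings designer's "Browse..." entry in the Type column may also accept a fully qualified enum type from a referenced assembly, but the cast keeps things simple.

    ```csharp
    // Sketch (my addition), placed inside the method that held the switch:
    // keep the setting as an int and cast it back to the enum, falling back
    // to LOCK for unknown values as the original default branch did.
    var action = (Action)Properties.Settings.Default.Action;

    if (!Enum.IsDefined(typeof(Action), action))
        action = Action.LOCK;

    SetAction(action);
    ```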

    Read the article

  • Zend Studio 5.5.1 on Windows XP - won't open!

    - by esther h
    I have been using Zend Studio 5.5.1 for the last year and a half (on Windows XP), with some occasional issues, such as a blank error dialog box when I started the program, which always went away when I restarted my computer; I also usually got messages about javaw.exe errors. But now, the program does not open at all. What happens is, I get a little dialog that says "Loading project - this is normal..." but then, nothing. The Zend program item is sitting in the taskbar, but when I click on it - nothing! There is nothing to show. I can right-click and press Close, but that is all. Restarting the computer did not help. I just uninstalled it, downloaded it again from the Zend website, and reinstalled. I tried opening it - I get the loading box, it seems to have loaded, and I even got the Tip of the Day box. But there is nothing showing behind them. Once I closed the tip box, I don't have any indication that Zend is open besides the program item in the taskbar. Windows Task Manager says it is running... Anyone have a clue? Help!!! Thanks

    Read the article

  • Can Visual Studio (should it be able to) compute a diff between any two changesets associated with a work item?

    - by Hamish Grubijan
    Here is my use case: I start on a project XYZ, for which I create a work item, and I make frequent check-ins, easily 10-20 in total. ALL of the code changes will be code-read and code-reviewed. The changesets are not consecutive - other people check in in-between my changes, although they are very unlikely to touch the exact same files. So ... at the end of the project I am interested in a "total diff" - as if there was a single check-in by me to complete the entire project. In theory this is computable. From the list of changesets associated with the work item, you get the list of all files that were affected. Then the algorithm can aggregate the individual diffs over each file and combine them into one. It is possible that a pure total diff is uncomputable due to the fact that someone else renamed files, or changed stuff around very close to, or in the same functions as, my changes. In that case ... I suppose a total diff can include those changes by non-me as well, and warn me about the fact. I would find this very useful, but I do not know how to do it in practice. Can Visual Studio 2008/2010 (and/or TFS server) do it? Are there other source control systems capable of doing this? Thanks.

    Read the article

  • The Windows Store... why did I sign up with this mess again?

    - by FransBouma
    Yesterday, Microsoft revealed that the Windows Store is now open to all developers in a wide range of countries and locations. For the people who think "wtf is the 'Windows Store'?", it's the central place where Windows 8 users will be able to find, download and purchase applications (or as we now have to say to not look like a computer illiterate: <accent style="Kentucky">aaaaappss</accent>) for Windows 8. As this is the store which is integrated into Windows 8, it's an interesting place for ISVs, as potential customers might very well look there first. This of course isn't true for all kinds of software, and developer tools in general aren't the kind of applications most users will download from the Windows store, but a presence there can't hurt. Now, this Windows Store hosts two kinds of applications: 'Metro-style' applications and 'Desktop' applications. The 'Metro-style' applications are applications created for the new 'Metro' UI which is present on Windows 8 desktop and Windows RT (the single color/big font fingerpaint-oriented UI). 'Desktop' applications are the applications we all run and use on Windows today. Our software are desktop applications. The Windows Store hosts all Metro-style applications locally in the store and handles the payment for these applications. This means you upload your application (sorry, 'app') to the store, jump through a lot of hoops, Microsoft verifies that your application is not violating a tremendous long list of rules and after everything is OK, it's published and hopefully you get customers and thus earn money. Money which Microsoft will pay you on a regular basis after customers buy your application. Desktop applications are not following this path however. Desktop applications aren't hosted by the Windows Store. Instead, the Windows Store more or less hosts a page with the application's information and where to get the goods. I.o.w.: it's nothing more than a product's Facebook page. Microsoft will simply redirect a visitor of the Windows Store to your website and the visitor will then use your site's system to purchase and download the application. This last bit of information is very important. So, this morning I started with fresh energy to register our company 'Solutions Design bv' at the Windows Store and our two applications, LLBLGen Pro and ORM Profiler. First I went to the Windows Store dashboard page. If you don't have an account, you have to log in or sign up if you don't have a live account. I signed in with my live account. After that, it greeted me with a page where I had to fill in a code which was mailed to me. My local mail server polls every several minutes for email so I had to kick it to get it immediately. I grabbed the code from the email and I was presented with a multi-step process to register myself as a company or as an individual. In red I was warned that this choice was permanent and not changeable. I chuckled: Microsoft apparently stores its data on paper, not in digital form. I chose 'company' and was presented with a lengthy form to fill out. On the form there were two strange remarks: Per company there can just be 1 (one, uno, not zero, not two or more) registered developer, and only that developer is able to upload stuff to the store. I have no idea how this works with large companies, oh the overhead nightmares... "Sorry, but John, our registered developer with the Windows Store is on holiday for 3 months, backpacking through Australia, no, he's not reachable at this point. M'yeah, sorry bud. 
Hey, did you fill in those TPS reports yesterday?" A separate Approver has to be specified, which has to be a different person than the registered developer. Apparently to Microsoft a company with just 1 person is not a company. Luckily we're with two people! *pfew*, dodged that one, otherwise I would be stuck forever: the choice I already made was not reversible! After I had filled out the form and it was all well and good and accepted by the Microsoft lackey who had to write it all down in some paper notebook ("Hey, be warned! It's a permanent choice! Written down in ink, can't be changed!"), I was presented with the question how I wanted to pay for all this. "Pay for what?" I wondered. Must be the paper they were scribbling the information on, I concluded. After all, there's a financial crisis going on! How could I forget! Silly me. "Ok fair enough". The price was 75 Euros, not the end of the world. I could only pay by credit card, so it was accepted quickly. Or so I thought. You see, Microsoft has a different idea about CC payments. In the normal world, you type in your CC number, some date, a name and a security code and that's it. But Microsoft wants to verify this even more. They want to make a verification purchase of a very small amount and are doing that with a special code in the description. You then have to type in that code in a special form in the Windows Store dashboard and after that you're verified. Of course they'll refund the small amount they pull from your card. Sounds simple, right? Well... no. The problem starts with the fact that I can't see the CC activity on some website: I have a bank issued CC card. I get the CC activity once a month on a piece of paper sent to me. The bank's online website doesn't show them. So it's possible I have to wait for this code till October 12th. One month. "So what, I'm not going to use it anyway, Desktop applications don't use the payment system", I thought. "Haha, you're so naive, dear developer!" Microsoft won't allow you to publish any applications till this verification is done. So no application publishing for a month. Wouldn't it be nice if things were, you know, digital, so things got done instantly? But of course, that lackey who scribbled everything in the Big Windows Store Registration Book isn't that quick. Can't blame him though. He's just doing his job. Now, after the payment was done, I was presented with a page which tells me Microsoft is going to use a third party company called 'Symantec', which will verify my identity again. The page explains to me that this could be done through email or phone and that they'll contact the Approver to verify my identity. "Phone?", I thought... that's a little drastic for a developer account to publish a single page of information about an external hosted software product, isn't it? On Facebook I just added a page, done. And paying you, Microsoft, took less information: you were happy to take my money before my identity was even 'verified' by this 3rd party's minions! "Double standards!", I roared. No-one cared. But it's the thought of getting it off your chest, you know. Luckily for me, everyone at Symantec was asleep when I was registering so they went for the fallback option in case phone calls were not possible: my Approver received an email. Imagine you have to explain the idiot web of security theater I was caught in to someone else who then has to reply a random person over the internet that I indeed was who I said I was. 
As she's a true sweetheart, she gave me the benefit of the doubt and assured that for now, I was who I said I was. Remember, this is for a desktop application, which is only a link to a website, some pictures and a piece of text. No file hosting, no payment processing, nothing, just a single page. Yeah, I also thought I was crazy. But we're not at the end of this quest yet. I clicked around in the confusing menus of the Windows Store dashboard and found the 'Desktop' section. I get a helpful screen with a warning in red that it can't find any certified 'apps'. True, I'm just getting started, buddy. I see a link: "Check the Windows apps you submitted for certification". Well, I haven't submitted anything, but let's see where it brings me. Oh the thrill of adventure! I click the link and I end up on this site: the hardware/desktop dashboard account registration. "Erm... but I just registered...", I mumbled to no-one in particular. Apparently for desktop registration / verification I have to register again, it tells me. But not only that, the desktop application has to be signed with a certificate. And not just some random el-cheapo certificate you can get at any mall's discount store. No, this certificate is special. It's precious. This certificate, the 'Microsoft Authenticode' Digital Certificate, is the only certificate that's acceptable, and jolly, it can be purchased from VeriSign for the price of only ... $99.-, but be quick, because this is a limited time offer! After that it's, I kid you not, $499.-. 500 dollars for a certificate to sign an executable. But, I do feel special, I got a special price. Only for me! I'm glowing. Not for long though. Here I started to wonder, what the benefit of it all was. I now again had to pay money for a shiny certificate which will add 'Solutions Design bv' to our installer as the publisher instead of 'unknown', while our customers download the file from our website. Not only that, but this was all about a Desktop application, which wasn't hosted by Microsoft. They only link to it. And make no mistake. These prices aren't single payments. Every year these have to be renewed. Like a membership of an exclusive club: you're special and privileged, but only if you cough up the dough. To give you an example how silly this all is: I added LLBLGen Pro and ORM Profiler to the Visual Studio Gallery some time ago. It's the same thing: it's a central place where one can find software which adds to / extends / works with Visual Studio. I could simply create the pages, add the information and they show up inside Visual Studio. No files are hosted at Microsoft, they're downloaded from our website. Exactly the same system. As I have to wait for the CC transcripts to arrive anyway, I can't proceed with publishing in this new shiny store. After the verification is complete I have to wait for verification of my software by Microsoft. Even Desktop applications need to be verified using a long list of rules which are mainly focused on Metro-style applications. Even while they're not hosted by Microsoft. I wonder what they'll find. "Your application wasn't approved. It violates rule 14 X sub D: it provides more value than our own competing framework". While I was writing this post, I tried to check something in the Windows Store Dashboard, to see whether I remembered it correctly. I was presented again with the question, after logging in with my live account, to enter the code that was just mailed to me. Not the previous code, a brand new one. 
Again I had to kick my mail server to pull the email to proceed. This was it. This 'experience' is so beyond miserable, I'm afraid I have to say goodbye for now to the 'Windows Store'. It's simply not worth my time. Now, about live accounts. You might know this: live accounts are tied to everything you do with Microsoft. So if you have an MSDN subscription, e.g. the one which costs over $5000.-, it's tied to this same live account. But the fun thing is, you can login with your live account to the MSDN subscriptions with just the account id and password. No additional code is mailed to you. While it gives you access to all Microsoft software available, including your licenses. Why the draconian security theater with this Windows Store, while all I want is to publish some desktop applications while on other Microsoft sites it's OK to simply sign in with your live account: no codes needed, no verification and no certificates? Microsoft, one thing you need with this store and that's: apps. Apps, apps, apps, apps, aaaaaaaaapps. Sorry, my bad, got carried away. I just can't stand the word 'app'. This store's shelves have to be filled to the brim with goods. But instead of being welcomed into the store with open arms, I have to fight an uphill battle with an endless list of rules and bullshit to earn the privilege to publish in this shiny store. As if I have to be thrilled to be one of the exclusive club called 'Windows Store Publishers'. As if Microsoft doesn't want it to succeed. Craig Stuntz sent me a link to an old blog post of his regarding code signing and uploading to Microsoft's old mobile store from back in the WinMo5 days: http://blogs.teamb.com/craigstuntz/2006/10/11/28357/. Good read and good background info about how little things changed over the years. I hope this helps Microsoft make things more clearer and smoother and also helps ISVs with their decision whether to go with the Windows Store scheme or ignore it. For now, I don't see the advantage of publishing there, especially not with the nonsense rules Microsoft cooked up. Perhaps it changes in the future, who knows.

    Read the article

  • Installing Lubuntu 14.04.1 forcepae fails

    - by Rantanplan
    I tried to install Lubuntu 14.04.1 from a CD. First, I chose Try Lubuntu without installing which gave: ERROR: PAE is disabled on this Pentium M (PAE can potentially be enabled with kernel parameter "forcepae" ... Following the description on https://help.ubuntu.com/community/PAE, I used forcepae and tried Try Lubuntu without installing again. That worked fine. dmesg | grep -i pae showed: [ 0.000000] Kernel command line: file=/cdrom/preseed/lubuntu.seed boot=casper initrd=/casper/initrd.lz quiet splash -- forcepae [ 0.008118] PAE forced! On the live-CD session, I tried installing Lubuntu double clicking on the install button on the desktop. Here, the CD starts running but then stops running and nothing happens. Next, I rebooted and tried installing Lubuntu directly from the boot menu screen using forcepae again. After a while, I receive the following error message: The installer encountered an unrecoverable error. A desktop session will now be run so that you may investigate the problem or try installing again. Hitting Enter brings me to the desktop. For what errors should I search? And how? Finally, I rebooted once more and tried Check disc for defects with forcepae option; no errors have been found. Now, I am wondering how to find the error or whether it would be better to follow advice c in https://help.ubuntu.com/community/PAE: "Move the hard disk to a computer on which the processor has PAE capability and PAE flag (that is, almost everything else than a Banias). Install the system as usual but don't add restricted drivers. After the install move the disk back." Thanks for some hints! Perhaps some of the following can help: On Lubuntu 12.04: cat /proc/cpuinfo processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 13 model name : Intel(R) Pentium(R) M processor 1.50GHz stepping : 6 microcode : 0x17 cpu MHz : 600.000 cache size : 2048 KB fdiv_bug : no hlt_bug : no f00f_bug : no coma_bug : no fpu : yes fpu_exception : yes cpuid level : 2 wp : yes flags : fpu vme de pse tsc msr mce cx8 mtrr pge mca cmov clflush dts acpi mmx fxsr sse sse2 ss tm pbe up bts est tm2 bogomips : 1284.76 clflush size : 64 cache_alignment : 64 address sizes : 32 bits physical, 32 bits virtual power management: uname -a Linux humboldt 3.2.0-67-generic #101-Ubuntu SMP Tue Jul 15 17:45:51 UTC 2014 i686 i686 i386 GNU/Linux lsb_release -a No LSB modules are available. 
Distributor ID: Ubuntu Description: Ubuntu 12.04.5 LTS Release: 12.04 Codename: precise cpuid eax in eax ebx ecx edx 00000000 00000002 756e6547 6c65746e 49656e69 00000001 000006d6 00000816 00000180 afe9f9bf 00000002 02b3b001 000000f0 00000000 2c04307d 80000000 80000004 00000000 00000000 00000000 80000001 00000000 00000000 00000000 00000000 80000002 20202020 20202020 65746e49 2952286c 80000003 6e655020 6d756974 20295228 7270204d 80000004 7365636f 20726f73 30352e31 007a4847 Vendor ID: "GenuineIntel"; CPUID level 2 Intel-specific functions: Version 000006d6: Type 0 - Original OEM Family 6 - Pentium Pro Model 13 - Stepping 6 Reserved 0 Brand index: 22 [not in table] Extended brand string: " Intel(R) Pentium(R) M processor 1.50GHz" CLFLUSH instruction cache line size: 8 Feature flags afe9f9bf: FPU Floating Point Unit VME Virtual 8086 Mode Enhancements DE Debugging Extensions PSE Page Size Extensions TSC Time Stamp Counter MSR Model Specific Registers MCE Machine Check Exception CX8 COMPXCHG8B Instruction SEP Fast System Call MTRR Memory Type Range Registers PGE PTE Global Flag MCA Machine Check Architecture CMOV Conditional Move and Compare Instructions FGPAT Page Attribute Table CLFSH CFLUSH instruction DS Debug store ACPI Thermal Monitor and Clock Ctrl MMX MMX instruction set FXSR Fast FP/MMX Streaming SIMD Extensions save/restore SSE Streaming SIMD Extensions instruction set SSE2 SSE2 extensions SS Self Snoop TM Thermal monitor 31 reserved TLB and cache info: b0: unknown TLB/cache descriptor b3: unknown TLB/cache descriptor 02: Instruction TLB: 4MB pages, 4-way set assoc, 2 entries f0: unknown TLB/cache descriptor 7d: unknown TLB/cache descriptor 30: unknown TLB/cache descriptor 04: Data TLB: 4MB pages, 4-way set assoc, 8 entries 2c: unknown TLB/cache descriptor On Lubuntu 14.04.1 live-CD with forcepae: cat /proc/cpuinfo processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 13 model name : Intel(R) Pentium(R) M processor 1.50GHz stepping : 6 microcode : 0x17 cpu MHz : 600.000 cache size : 2048 KB physical id : 0 siblings : 1 core id : 0 cpu cores : 1 apicid : 0 initial apicid : 0 fdiv_bug : no f00f_bug : no coma_bug : no fpu : yes fpu_exception : yes cpuid level : 2 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 sep mtrr pge mca cmov clflush dts acpi mmx fxsr sse sse2 ss tm pbe bts est tm2 bogomips : 1284.68 clflush size : 64 cache_alignment : 64 address sizes : 36 bits physical, 32 bits virtual power management: uname -a Linux lubuntu 3.13.0-32-generic #57-Ubuntu SMP Tue Jul 15 03:51:12 UTC 2014 i686 i686 i686 GNU/Linux lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 14.04.1 LTS Release: 14.04 Codename: trusty cpuid CPU 0: vendor_id = "GenuineIntel" version information (1/eax): processor type = primary processor (0) family = Intel Pentium Pro/II/III/Celeron/Core/Core 2/Atom, AMD Athlon/Duron, Cyrix M2, VIA C3 (6) model = 0xd (13) stepping id = 0x6 (6) extended family = 0x0 (0) extended model = 0x0 (0) (simple synth) = Intel Pentium M (Dothan B1) / Celeron M (Dothan B1), 90nm miscellaneous (1/ebx): process local APIC physical ID = 0x0 (0) cpu count = 0x0 (0) CLFLUSH line size = 0x8 (8) brand index = 0x16 (22) brand id = 0x16 (22): Intel Pentium M, .13um feature information (1/edx): x87 FPU on chip = true virtual-8086 mode enhancement = true debugging extensions = true page size extensions = true time stamp counter = true RDMSR and WRMSR support = true physical address extensions = false machine check exception = true CMPXCHG8B inst. 
= true APIC on chip = false SYSENTER and SYSEXIT = true memory type range registers = true PTE global bit = true machine check architecture = true conditional move/compare instruction = true page attribute table = true page size extension = false processor serial number = false CLFLUSH instruction = true debug store = true thermal monitor and clock ctrl = true MMX Technology = true FXSAVE/FXRSTOR = true SSE extensions = true SSE2 extensions = true self snoop = true hyper-threading / multi-core supported = false therm. monitor = true IA64 = false pending break event = true feature information (1/ecx): PNI/SSE3: Prescott New Instructions = false PCLMULDQ instruction = false 64-bit debug store = false MONITOR/MWAIT = false CPL-qualified debug store = false VMX: virtual machine extensions = false SMX: safer mode extensions = false Enhanced Intel SpeedStep Technology = true thermal monitor 2 = true SSSE3 extensions = false context ID: adaptive or shared L1 data = false FMA instruction = false CMPXCHG16B instruction = false xTPR disable = false perfmon and debug = false process context identifiers = false direct cache access = false SSE4.1 extensions = false SSE4.2 extensions = false extended xAPIC support = false MOVBE instruction = false POPCNT instruction = false time stamp counter deadline = false AES instruction = false XSAVE/XSTOR states = false OS-enabled XSAVE/XSTOR = false AVX: advanced vector extensions = false F16C half-precision convert instruction = false RDRAND instruction = false hypervisor guest status = false cache and TLB information (2): 0xb0: instruction TLB: 4K, 4-way, 128 entries 0xb3: data TLB: 4K, 4-way, 128 entries 0x02: instruction TLB: 4M pages, 4-way, 2 entries 0xf0: 64 byte prefetching 0x7d: L2 cache: 2M, 8-way, sectored, 64 byte lines 0x30: L1 cache: 32K, 8-way, 64 byte lines 0x04: data TLB: 4M pages, 4-way, 8 entries 0x2c: L1 data cache: 32K, 8-way, 64 byte lines extended feature flags (0x80000001/edx): SYSCALL and SYSRET instructions = false execution disable = false 1-GB large page support = false RDTSCP = false 64-bit extensions technology available = false Intel feature flags (0x80000001/ecx): LAHF/SAHF supported in 64-bit mode = false LZCNT advanced bit manipulation = false 3DNow! PREFETCH/PREFETCHW instructions = false brand = " Intel(R) Pentium(R) M processor 1.50GHz" (multi-processing synth): none (multi-processing method): Intel leaf 1 (synth) = Intel Pentium M (Dothan B1), 90nm

    Read the article

  • DDD East Anglia, 29th June 2013 - Async Patterns presentation and source code

    - by Liam Westley
    Originally posted on: http://geekswithblogs.net/twickers/archive/2013/07/01/ddd-east-anglia-29th-june-2013---async-patterns-presentation.aspx Many thanks to the team in Cambridge for an awesome first DDD East Anglia conference. I definitely appreciate how each of the different areas has its own distinctive atmosphere and feel. Thanks to some great sponsors we enjoyed a great venue and some excellent nibbles. For those who attended my Async session, my source code and presentation are available on GitHub: https://github.com/westleyl/DDDEastAnglia2013-Async.git If you are new to Git then the easiest client to install is GitHub for Windows, a graphical UI for accessing GitHub. Personally, I also have Git Extensions and Tortoise Git installed. Tortoise Git is the file explorer add-in that works in a familiar manner to TortoiseSVN. As I mentioned during the presentation, I have not included the sample data (the music files) in the source code placed on GitHub, but I have included instructions on how to download them from http://silents.bandcamp.com and place them in the correct folders. Also, Windows Media Player, by default, does not play Ogg Vorbis and Flac music files; however, you can download the codec installer for these, for free, from http://xiph.org/dshow. I have included the .NET 4.0 version of the source code that uses the Microsoft.Bcl.Async NuGet package - once you have got the project from GitHub you will need to install this NuGet package for the code to compile: Load the project into Visual Studio 2012. Access the NuGet package manager (Tools -> Library Package Manager -> Manage NuGet Packages For Solution). Highlight Online and then Search Online for microsoft.bcl.async. Click on the Install button. Resources: You can download the Task-based Asynchronous Pattern white paper by Stephen Toub, which was the inspiration for this presentation, from here - http://www.microsoft.com/en-us/download/details.aspx?id=19957 Presentation: If you just want the presentation and don't want to bother with a GitHub login you can download the PowerPoint presentation from here.
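    For readers who haven't used the Task-based Asynchronous Pattern before, the kind of code the session covers looks roughly like the sketch below. This is not taken from the talk's repository; it is a generic async/await example (on .NET 4.0 the Microsoft.Bcl.Async package supplies the compiler support, though some BCL async methods differ from their 4.5 counterparts).

    ```csharp
    // Generic illustration of the Task-based Asynchronous Pattern (not from the
    // session's sample code): read a file without blocking the calling thread.
    using System.IO;
    using System.Threading.Tasks;

    static class AsyncDemo
    {
        public static async Task<int> CountBytesAsync(string path)
        {
            using (FileStream stream = File.OpenRead(path))
            {
                var buffer = new byte[8192];
                int total = 0, read;
                // The await frees the calling thread while each read is in flight.
                while ((read = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
                    total += read;
                return total;
            }
        }
    }
    ```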

    Read the article

  • How to make a Custom Data Generator for SQL XML DataType.

    - by Keith Sirmons
    Howdy, I am using Visual Studio 2010 and am playing around with the Database Projects. I am creating a DataGenerationPlan to insert data into a simple table, in which one of the column datatypes is XML. Out of the box, the generation plan uses the Regular Expression generator and generates something like this : HGcSv9wa7yM44T9x5oFT4pmBkEmv62lJ7OyAmCnL6yqXC2X.......... I am looking at creating a custom data Generator for this data type and have followed this site for the basics: http://msdn.microsoft.com/en-us/library/aa833244.aspx This example works if I am creating a string datatype and using it for a nvarchar datatype. What do I need to change to hook this Generator to the XML Datatype? Below are my code files. The string property works for nvarchar. The XElement property does not work for the xml datatype, and the RecordXMLDataGenerator is not listed as an option in the Generator column for the generation plan. CustomDataGenerators: using System; using System.Collections.Generic; using System.Linq; using System.Text; using Microsoft.Data.Schema.Tools.DataGenerator; using Microsoft.Data.Schema.Extensibility; using Microsoft.Data.Schema; using Microsoft.Data.Schema.Sql; using System.Xml.Linq; namespace CustomDataGenerators { [DatabaseSchemaProviderCompatibility(typeof(SqlDatabaseSchemaProvider))] public class RecordXMLDataGenerator : Generator { private XElement _RecordData; [Output(Description = "Generates string of XML Data for the Record.", Name = "RecordDataString")] public string RecordDataString { get { return _RecordData.ToString(SaveOptions.None); } } [Output(Description = "Generates XML Data for the Record.", Name = "RecordData")] public XElement RecordData { get { return _RecordData; } } protected override void OnGenerateNextValues() { base.OnGenerateNextValues(); XElement element = new XElement("Root", new XElement("Children1", 1), new XElement("Children6", 6) ); _RecordData = element; } } } XML Extensions File: <?xml version="1.0" encoding="utf-8" ?> <extensions assembly="" version="1" xmlns="urn:Microsoft.Data.Schema.Extensions" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="urn:Microsoft.Data.Schema.Extensions Microsoft.Data.Schema.Extensions.xsd"> <extension type="CustomDataGenerators.RecordXMLDataGenerator" assembly="CustomDataGenerators, Version=1.0.0.0, Culture=neutral, PublicKeyToken=xxxxxxxxxxxx" enabled="true"/> </extensions> Table.sql: CREATE TABLE [dbo].[Record] ( id int IDENTITY (1,1) NOT NULL, recordData xml NULL, userId int NULL, test nvarchar(max) NULL, rowver rowversion NULL, CONSTRAINT pk_RecordID PRIMARY KEY (id) )

    Read the article

  • OpenCV 2.4.2 on Matlab 2012b (Windows 7)

    - by Maik Xhani
    Hello, I am trying to use OpenCV 2.4.2 in Matlab 2012b. I have tried these actions: downloaded OpenCV 2.4.2; used CMake on the opencv folder using the Visual Studio 10 and Visual Studio 10 Win64 compilers; built Debug and Release versions with Visual Studio, first without any other option and then with D_SCL_SECURE=1 specified for every project; changed Matlab's mexopts.bat, adding new lines referring to the library and include paths (see bottom for the mexopts.bat content); with the Visual Studio 10 compiler tried to compile a simple file with an OpenCV library inclusion, and all goes well; tried to compile something that actually uses OpenCV commands and got errors. I used the openmexopencv library and when I tried to compile something I get this error: cv.make mex -largeArrayDims -D_SECURE_SCL=1 -Iinclude -I"C:\OpenCV\build\include" -L"C:\OpenCV\build\x64\vc10\lib" -lopencv_calib3d242 -lopencv_contrib242 -lopencv_core242 -lopencv_features2d242 -lopencv_flann242 -lopencv_gpu242 -lopencv_haartraining_engine -lopencv_highgui242 -lopencv_imgproc242 -lopencv_legacy242 -lopencv_ml242 -lopencv_nonfree242 -lopencv_objdetect242 -lopencv_photo242 -lopencv_stitching242 -lopencv_ts242 -lopencv_video242 -lopencv_videostab242 src+cv\CamShift.cpp lib\MxArray.obj -output +cv\CamShift CamShift.cpp C:\Program Files\MATLAB\R2012b\extern\include\tmwtypes.h(821) : warning C4091: 'typedef ': ignored on left of 'wchar_t' when no variable is declared c:\program files\matlab\r2012b\extern\include\matrix.h(319) : error C4430: missing type specifier - int assumed. Note: C++ does not support default-int. The content of my mexopts.bat is @echo off rem MSVC100OPTS.BAT rem rem Compile and link options used for building MEX-files rem using the Microsoft Visual C++ compiler version 10.0 rem rem $Revision: 1.1.6.4.2.1 $ $Date: 2012/07/12 13:53:59 $ rem Copyright 2007-2009 The MathWorks, Inc.
rem rem StorageVersion: 1.0 rem C++keyFileName: MSVC100OPTS.BAT rem C++keyName: Microsoft Visual C++ 2010 rem C++keyManufacturer: Microsoft rem C++keyVersion: 10.0 rem C++keyLanguage: C++ rem C++keyLinkerName: Microsoft Visual C++ 2010 rem C++keyLinkerVersion: 10.0 rem rem ******************************************************************** rem General parameters rem ******************************************************************** set MATLAB=%MATLAB% set VSINSTALLDIR=c:\Program Files (x86)\Microsoft Visual Studio 10.0 set VCINSTALLDIR=%VSINSTALLDIR%\VC set OPENCVDIR=C:\OpenCV rem In this case, LINKERDIR is being used to specify the location of the SDK set LINKERDIR=c:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\ set PATH=%VCINSTALLDIR%\bin\amd64;%VCINSTALLDIR%\bin;%VCINSTALLDIR%\VCPackages;%VSINSTALLDIR%\Common7\IDE;%VSINSTALLDIR%\Common7\Tools;%LINKERDIR%\bin\x64;%LINKERDIR%\bin;%MATLAB_BIN%;%PATH% set INCLUDE=%OPENCVDIR%\build\include;%VCINSTALLDIR%\INCLUDE;%VCINSTALLDIR%\ATLMFC\INCLUDE;%LINKERDIR%\include;%INCLUDE% set LIB=%OPENCVDIR%\build\x64\vc10\lib;%VCINSTALLDIR%\LIB\amd64;%VCINSTALLDIR%\ATLMFC\LIB\amd64;%LINKERDIR%\lib\x64;%MATLAB%\extern\lib\win64;%LIB% set MW_TARGET_ARCH=win64 rem ******************************************************************** rem Compiler parameters rem ******************************************************************** set COMPILER=cl set COMPFLAGS=/c /GR /W3 /EHs /D_CRT_SECURE_NO_DEPRECATE /D_SCL_SECURE_NO_DEPRECATE /D_SECURE_SCL=0 /DMATLAB_MEX_FILE /nologo /MD set OPTIMFLAGS=/O2 /Oy- /DNDEBUG set DEBUGFLAGS=/Z7 set NAME_OBJECT=/Fo rem ******************************************************************** rem Linker parameters rem ******************************************************************** set LIBLOC=%MATLAB%\extern\lib\win64\microsoft set LINKER=link set LINKFLAGS=/dll /export:%ENTRYPOINT% /LIBPATH:"%LIBLOC%" libmx.lib libmex.lib libmat.lib /MACHINE:X64 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib opencv_calib3d242.lib opencv_contrib242.lib opencv_core242.lib opencv_features2d242.lib opencv_flann242.lib opencv_gpu242.lib opencv_haartraining_engine.lib opencv_imgproc242.lib opencv_highgui242.lib opencv_legacy242.lib opencv_ml242.lib opencv_nonfree242.lib opencv_objdetect242.lib opencv_photo242.lib opencv_stitching242.lib opencv_ts242.lib opencv_video242.lib opencv_videostab242.lib /nologo /manifest /incremental:NO /implib:"%LIB_NAME%.x" /MAP:"%OUTDIR%%MEX_NAME%%MEX_EXT%.map" set LINKOPTIMFLAGS= set LINKDEBUGFLAGS=/debug /PDB:"%OUTDIR%%MEX_NAME%%MEX_EXT%.pdb" set LINK_FILE= set LINK_LIB= set NAME_OUTPUT=/out:"%OUTDIR%%MEX_NAME%%MEX_EXT%" set RSP_FILE_INDICATOR=@ rem ******************************************************************** rem Resource compiler parameters rem ******************************************************************** set RC_COMPILER=rc /fo "%OUTDIR%mexversion.res" set RC_LINKER= set POSTLINK_CMDS=del "%LIB_NAME%.x" "%LIB_NAME%.exp" set POSTLINK_CMDS1=mt -outputresource:"%OUTDIR%%MEX_NAME%%MEX_EXT%;2" -manifest "%OUTDIR%%MEX_NAME%%MEX_EXT%.manifest" set POSTLINK_CMDS2=del "%OUTDIR%%MEX_NAME%%MEX_EXT%.manifest" set POSTLINK_CMDS3=del "%OUTDIR%%MEX_NAME%%MEX_EXT%.map"

    Read the article
