Search Results

Search found 22177 results on 888 pages for 'dell studio xps 16'.


  • Windows update breaks DLLs?

    - by shoosh
    I'm compiling a project that uses multiple DLLs and builds with VS2008. After a recent Windows update, DLLs compiled on my computer stopped working on other computers. After some investigation it turned out that the update changed the CRT redistributable I'm compiling against from version "9.0.21022.8" to version "9.0.30729.4148". This is evident from the manifest of the EXE I'm building, which contains the following:

        <dependency>
          <dependentAssembly>
            <assemblyIdentity type="win32" name="Microsoft.VC90.CRT" version="9.0.21022.8"
                              processorArchitecture="amd64" publicKeyToken="1fc8b3b9a1e18e3b"></assemblyIdentity>
          </dependentAssembly>
        </dependency>
        <dependency>
          <dependentAssembly>
            <assemblyIdentity type="win32" name="Microsoft.VC90.CRT" version="9.0.30729.4148"
                              processorArchitecture="amd64" publicKeyToken="1fc8b3b9a1e18e3b"></assemblyIdentity>
          </dependentAssembly>
        </dependency>

    In other words, it wants to use two different versions of the CRT at the same time: the second version is needed by the code I'm compiling right now, and the first version is needed by older DLLs that were compiled a few weeks ago. On the computers where the application is deployed this becomes a problem, because they get their CRT DLL from a local folder called Microsoft.VC90.CRT rather than from WinSxS, and that folder can't hold two different versions of the DLL. Is there a known solution to this issue, or do I need to recompile all of the other DLLs against the new CRT?

    Read the article

  • CIL and JVM: little endian to big endian in C# and Java

    - by Haythem
    Hello, I am using C# on the client, where I convert double values to a byte array. I am using Java on the server, with writeDouble and readDouble to convert doubles to and from byte arrays. The problem is that the double values that arrive on the Java side are not the double values that were handed to C# at the start.

    writeDouble in Java converts the double argument to a long using the doubleToLongBits method, and then writes that long value to the underlying output stream as an 8-byte quantity, high byte first. doubleToLongBits returns a representation of the specified floating-point value according to the IEEE 754 floating-point "double format" bit layout.

    The program on the server is expecting 64-102-112-0-0-0-0-0 from C# to convert it to 1700.0, but it receives 0000014415464 from C# after C# converted 1700.0. This is my code in C#:

        class User
        {
            double workingStatus;

            public void persist()
            {
                byte[] dataByte;
                using (MemoryStream ms = new MemoryStream())
                {
                    using (BinaryWriter bw = new BinaryWriter(ms))
                    {
                        bw.Write(workingStatus);
                        bw.Flush();
                        bw.Close();
                    }
                    dataByte = ms.ToArray();
                    for (int j = 0; j < dataByte.Length; j++)
                    {
                        Console.Write(dataByte[j]);
                    }
                }
            }

            public double WorkingStatus
            {
                get { return workingStatus; }
                set { workingStatus = value; }
            }
        }

        class Test
        {
            static void Main()
            {
                User user = new User();
                user.WorkingStatus = 1700.0;
                user.persist();
            }
        }

    Thank you for the help.
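    A minimal sketch of the usual fix, assuming the goal is to hand Java the same 8 bytes that DataOutputStream.writeDouble would produce: BinaryWriter writes doubles in little-endian order, while Java reads them high byte first, so the byte order has to be reversed on the .NET side. The helper name below is made up for illustration:

        using System;

        static class BigEndian
        {
            // Returns the IEEE 754 bytes of a double with the high byte first,
            // which is the layout Java's DataInputStream.readDouble expects.
            public static byte[] GetBytes(double value)
            {
                byte[] bytes = BitConverter.GetBytes(value);  // machine order (little-endian on x86/x64)
                if (BitConverter.IsLittleEndian)
                    Array.Reverse(bytes);                     // flip to network/big-endian order
                return bytes;
            }
        }

    For example, sending BigEndian.GetBytes(user.WorkingStatus) instead of the raw MemoryStream output should give the server the byte order it is waiting for.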

    Read the article

  • How do I get source file information with dumpbin /symbols when compiling with VS 2005?

    - by Thomas Dartsch
    I have a tool which uses the output of dumpbin /symbols to do some dependency analysis on our C/C++ libraries. When we compiled the libs with VS 6.0, the dumpbin COFF SYMBOL TABLE contained entries like

        000 00000008 DEBUG notype Filename | .file x:\mydir\mysource.c

    allowing me to get the relationship between sources and defined/used symbols, which is essential for my tool. When we compile with VS 2005, these entries are missing. When I look at the libs with a hex editor there seems to be no filename information in the binary files at all, so it does not look like a dumpbin problem but something compilation-related. So I'm looking for a way to get the Filename entries back into my libraries when compiling with VS 2005.

    Read the article

  • Why does the MSVCRT library generate conflicts at link time?

    - by neuviemeporte
    I am building a project in Visual C++ 2008: an example MFC-based app for a static C++ class library that I will be using in my own project soon. While building the Debug configuration, I get the following:

        warning LNK4098: defaultlib 'MSVCRT' conflicts with use of other libs; use /NODEFAULTLIB:library

    After using the recommended option (by adding "msvcrt" to the "Ignore specific library" field in the project's linker settings for the Debug configuration), the program links and runs fine. However, I'd like to find out why this conflict occurred, why I have to ignore a critical library, whether I should expect problems later if I keep the ignore, and what happens if I don't (because the program builds anyway). At the same time, the Release configuration warns:

        warning LNK4075: ignoring '/EDITANDCONTINUE' due to '/OPT:ICF' specification
        warning LNK4098: defaultlib 'MSVCRTD' conflicts with use of other libs; use /NODEFAULTLIB:library

    I'm guessing that the "D" suffix means this is the debug version of the VC++ runtime; I have no idea why that one gets used this time. Anyway, adding "msvcrtd" to the ignore field causes lots of link errors of the form:

        error LNK2001: unresolved external symbol imp_CrtDbgReportW

    Any insight greatly appreciated.

    Read the article

  • Should I be relying on WebTests for data validation?

    - by Alexander Kahoun
    I have a suite of web tests created for a web service. I use it for testing a particular input method that updates a SQL database. The web service doesn't have a way to retrieve the data; that's not its purpose, only to update it. I have a validator that validates the response XML that the web service generates for each request. All that works fine.

    It was suggested by a teammate that I add data validation, so that after the initial response validator runs I check the database and compare the data with what was in the input request. We have a number of services and libraries, separate from the web service I'm testing, that I can use to get the data and compare it.

    The problem is that when I run the web test, the data validation always fails even when the request succeeds. I've tried putting the thread to sleep between the response validation and the data validation, but to no avail; it always gets the data from before the response validation. I can set a breakpoint and visually see that the data has been updated in the DB; the funny thing is that when I step through it in debug with the breakpoint, it does validate successfully.

    Before I get too much more into this issue I have to ask: is this the purpose of web tests? Should I be able to validate data through service calls in this manner, or am I asking too much of a web test and the response validation is as far as I should go?
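    If data validation through service calls is kept, one way around the timing problem is to poll instead of sleeping for a fixed amount. This is only a sketch under the assumption that the update is simply not yet visible when the validation rule runs; the DbAssert helper and the readActual delegate are made-up names, not part of the web test API:

        using System;
        using System.Threading;

        static class DbAssert
        {
            // Re-reads the value under test until it matches or the timeout expires.
            public static bool EventuallyEquals<T>(Func<T> readActual, T expected, TimeSpan timeout)
            {
                DateTime deadline = DateTime.UtcNow + timeout;
                do
                {
                    if (Equals(readActual(), expected))
                        return true;
                    Thread.Sleep(250);  // brief back-off between reads
                } while (DateTime.UtcNow < deadline);
                return false;
            }
        }

    The validation rule would then call something like DbAssert.EventuallyEquals(() => repository.GetValueFor(requestId), expectedValue, TimeSpan.FromSeconds(10)), where repository stands in for whatever library already retrieves the data.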

    Read the article

  • How to tell if two EXEs are the same code-wise?

    - by yumcious
    Is there a way to detect whether two EXEs (compiled with VS.NET 2008 for C++/MFC) have no code-level changes between them, i.e. for purposes of knowing that there have been no statement changes? This is for compliance purposes, when my vendor ships me an EXE ostensibly with no changes made to the code since the last time we tested it. Is there a tool to check that this is so? Cheers

    Read the article

  • My ASP.NET Accordion will not animate panel changes when triggered by check boxes.

    - by CowKingDeluxe
    My accordion panel in markup:

        <ajaxToolkit:Accordion ID="MyAccordion" runat="server" SelectedIndex="0"
            HeaderCssClass="accordionHeader" HeaderSelectedCssClass="accordionHeaderSelected"
            ContentCssClass="accordionContent" AutoSize="None" FadeTransitions="true"
            TransitionDuration="250" FramesPerSecond="40" RequireOpenedPane="false"
            SuppressHeaderPostbacks="true">
          <Panes>
            <ajaxToolkit:AccordionPane ID="AccordionPane10" runat="server">
              <Header>BBBBBBBBBB</Header>
              <Content>
                FFFFFFFF:<br /><br />
                <table cellpadding="0" cellspacing="0" width="750"><tr>
                  <td width="450" class="verificationtdleft">
                    <asp:Image ID="step4_originalimage" runat="server" AlternateText="" />
                  </td>
                  <td width="300">
                    <asp:CheckBox ID="CB_Verification0" runat="server" AutoPostBack="true" /> Verify
                  </td>
                </tr></table>
              </Content>
            </ajaxToolkit:AccordionPane>
            <ajaxToolkit:AccordionPane ID="AccordionPane11" runat="server">
              <Header>GGGGGGGGG</Header>
              <Content>
                HHHHHHHHHH:<br /><br />
                <table cellpadding="0" cellspacing="0" width="750"><tr>
                  <td width="450" class="verificationtdleft">
                    <asp:Image ID="step4_image_thumbnail" runat="server" AlternateText="" />
                  </td>
                  <td width="300">
                    <asp:CheckBox ID="CB_Verification1" runat="server" AutoPostBack="true" /> Verify
                  </td>
                </tr></table>
              </Content>
            </ajaxToolkit:AccordionPane>
          </Panes>
        </ajaxToolkit:Accordion>

    Here's how I handle the checkbox check:

        Private Sub CB_Verification0_CheckedChanged(ByVal sender As Object, ByVal e As System.EventArgs) Handles CB_Verification0.CheckedChanged
            MyAccordion.SelectedIndex = 1
        End Sub

    I'm causing the panels to change correctly; it's just that they don't animate like they do when I click the headers. When I click the checkbox to change the panel, the current panel disappears instantly and the new one appears instantly, but I want it to be animated as if I had clicked the headers. Is there a way to trigger the animation when force-changing the visible panel?

    Read the article

  • VS2008: File creation fails randomly in unit testing?

    - by Tim
    I'm working on implementing a reasonably simple XML serializer/deserializer (log file parser) application in C# .NET with VS 2008. I have about 50 unit tests right now for various parts of the code (mostly for the various serialization operations), and some of them seem to fail mostly at random when they deal with file I/O.

    The way the tests are structured is that in the test setup method I create a new empty file at a certain predetermined location, and close the stream I get back. Then I run some basic tests on the file (varying by what exactly is under test). In the cleanup method, I delete the file again.

    A large portion (usually 30 or more, though the number varies from run to run) of my unit tests will fail in the initialize method, claiming they can't access the file I'm trying to create. I can't pin down the exact reason, since a test that works one run fails the next; they all succeed when run individually. What's the problem here? Why can't I access this file across multiple unit tests?

    Relevant methods for a unit test that will fail some of the time:

        [TestInitialize()]
        public void LogFileTestInitialize()
        {
            this.testFolder = System.Environment.GetFolderPath(
                System.Environment.SpecialFolder.LocalApplicationData);
            this.testPath = this.testFolder + "\\empty.lfp";
            System.IO.File.Create(this.testPath);
        }

        [TestMethod()]
        public void LogFileConstructorTest()
        {
            string filePath = this.testPath;
            LogFile target = new LogFile(filePath);
            Assert.AreNotEqual(null, target);
            Assert.AreEqual(this.testPath, target.filePath);
            Assert.AreEqual("empty.lfp", target.fileName);
            Assert.AreEqual(this.testFolder + "\\empty.lfp.lfpdat", target.metaPath);
        }

        [TestCleanup()]
        public void LogFileTestCleanup()
        {
            System.IO.File.Delete(this.testPath);
        }

    And the LogFile() constructor:

        public LogFile(String filePath)
        {
            this.entries = new List<Entry>();
            this.filePath = filePath;
            this.metaPath = filePath + ".lfpdat";
            this.fileName = filePath.Substring(filePath.LastIndexOf("\\") + 1);
        }

    The precise error message:

        Initialization method LogFileParserTester.LogFileTest.LogFileTestInitialize threw exception.
        System.IO.IOException: System.IO.IOException: The process cannot access the file
        'C:\Users\<user>\AppData\Local\empty.lfp' because it is being used by another process..
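    One detail worth checking, shown as a sketch rather than a confirmed diagnosis: File.Create returns an open FileStream, and the initializer above never disposes it, so the handle is only released whenever the finalizer eventually runs. That would explain both the random sharing violations and why stepping through under the debugger succeeds. Closing the stream immediately keeps the setup otherwise identical:

        [TestInitialize()]
        public void LogFileTestInitialize()
        {
            this.testFolder = System.Environment.GetFolderPath(
                System.Environment.SpecialFolder.LocalApplicationData);
            this.testPath = this.testFolder + "\\empty.lfp";

            // Dispose the FileStream that File.Create returns so the file is not
            // still locked when this test (or the next one) tries to use it.
            using (System.IO.File.Create(this.testPath))
            {
            }
        }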

    Read the article

  • VSTS load test data source issues

    - by ashish.s
    Hello, I have a simple test using a VSTS load test that uses a data source. The connection string for the source is as follows:

        <connectionStrings>
          <add name="MyExcelConn"
               connectionString="Driver={Microsoft Excel Driver (*.xls)};Dsn=Excel Files;dbq=loginusers.xls;defaultdir=.;driverid=790;maxbuffersize=4096;pagetimeout=20;ReadOnly=False"
               providerName="System.Data.Odbc" />
        </connectionStrings>

    With the data source configured this way, I am getting the following error:

        The unit test adapter failed to connect to the data source or to read the data. For more information on
        troubleshooting this error, see "Troubleshooting Data-Driven Unit Tests"
        (http://go.microsoft.com/fwlink/?LinkId=62412) in the MSDN Library. Error details:
        ERROR [42000] [Microsoft][ODBC Excel Driver] Cannot update. Database or object is read-only.
        ERROR [IM006] [Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed
        ERROR [42000] [Microsoft][ODBC Excel Driver] Cannot update. Database or object is read-only.

    I wrote a test just to check whether I could create an ODBC connection, and that works:

        [TestMethod]
        public void TestExcelFile()
        {
            string connString = ConfigurationManager.ConnectionStrings["MyExcelConn"].ConnectionString;
            using (OdbcConnection con = new OdbcConnection(connString))
            {
                con.Open();
                System.Data.Odbc.OdbcCommand objCmd = new OdbcCommand("SELECT * FROM [loginusers$]");
                objCmd.Connection = con;
                OdbcDataAdapter adapter = new OdbcDataAdapter(objCmd);
                DataSet ds = new DataSet();
                adapter.Fill(ds);
                Assert.IsTrue(ds.Tables[0].Rows.Count > 1);
            }
        }

    Any ideas?
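    One thing that is cheap to rule out (an assumption, not a confirmed cause): .xls files pulled from source control often arrive with the read-only file attribute set, and with ReadOnly=False in the connection string the Excel ODBC driver wants write access to the workbook. A small sketch that clears the attribute before the run; the class name is illustrative only:

        using System.IO;

        static class TestDataFiles
        {
            // Drops the ReadOnly attribute so the ODBC Excel driver can open
            // the workbook for writing; path handling is left to the caller.
            public static void MakeWritable(string xlsPath)
            {
                if (File.Exists(xlsPath))
                {
                    File.SetAttributes(xlsPath, FileAttributes.Normal);
                }
            }
        }

    Calling TestDataFiles.MakeWritable(@".\loginusers.xls") from a setup step (or simply clearing the flag in Explorer) would tell you quickly whether the read-only attribute is the problem.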

    Read the article

  • PLKs and Web Service Software Factory

    - by Nix
    We found a bug in the Web Service Software Factory; a description can be found here. There have been no updates on it, so we decided to download the code and fix it ourselves. It was a very simple bug and we patched it with maybe three lines of code. However, we have now tried to repackage it and use it, and are finding that this is seemingly an impossible process.

    Can someone please explain the PLK process to me? I have read all about PLKs but still don't understand what is really required to distribute a VS package. I was able to get it to load and run using a PLK obtained from here, but I am assuming that you have to be a partner to get a functional PLK that will be recognized on other people's systems? Every time I try to install this on a different computer I get a "Package Load Failure". Is the reason I am getting errors that I am not using a partner key? Is there any other way around this? For instance, is there any way we can have an "internal" VS package that we can distribute?

    Edit: files I had to change to get it to work:

        First run devenv PostInstall.proj
        Generate your PLKs and replace ##Package PLK## (.resx files)
            -- just note that the package version is not the class name but is "Web Service Software Factory: Modeling Edition"
            -- and you need to remove the new lines from the key
        ProductDefinitionRegistryFragment.wxi line 1252 (update version to whatever version you used in the PLK)
        Uncomment all // [VSShell::ProvideLoadKey("Standard", constants in the .tt files

    Read the article

  • Problem passing ELMAH log id to Custom Error page in ASP.NET

    - by Ronnie Overby
    I am using ELMAH to log unhandled exceptions in an ASP.NET WebForms application. Logging is working fine. I want to pass the ELMAH error log id to a custom error page that will give the user the ability to email an administrator about the error. I have followed the advice from this answer. Here is my global.asax code:

        void ErrorLog_Logged(object sender, ErrorLoggedEventArgs args)
        {
            Session[StateKeys.ElmahLogId] = args.Entry.Id;
            // this doesn't work either:
            // HttpContext.Current.Items[StateKeys.ElmahLogId] = args.Entry.Id;
        }

    But on the custom error page, the session variable reference and HttpContext.Current.Items both give me a NullReference exception. How can I pass the ID to my custom error page?
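    One commonly suggested workaround, sketched here with an assumed page name (~/ErrorPage.aspx) and no guarantee about module ordering: because a customErrors redirect issues a brand-new request, per-request state such as HttpContext.Items does not survive it, so the id can be carried on the query string instead:

        void ErrorLog_Logged(object sender, ErrorLoggedEventArgs args)
        {
            HttpContext context = HttpContext.Current;
            if (context != null)
            {
                // Clear the pending error so the default error handling does not
                // take over, then put the ELMAH id in the URL for the error page
                // to read from Request.QueryString["id"].
                context.Server.ClearError();
                string url = "~/ErrorPage.aspx?id=" + context.Server.UrlEncode(args.Entry.Id);
                context.Response.Redirect(url, false);
                context.ApplicationInstance.CompleteRequest();
            }
        }

    Whether this redirect wins over the customErrors redirect depends on how the application is configured, so treat it as a starting point rather than the established ELMAH recipe.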

    Read the article

  • Environment variables get lost between MSBuild projects

    - by DotNetter
    Hi, I have a .NET solution containing the following projects:

        web application (WAP)
        web deployment (WDP, .wdproj)
        WiX setup (WIX, .wixproj)

    In the WDP I've used a custom MSBuild task (SetEnvVar) to set some environment variables for further use in the build process. After setting them I can use them without problems in the WDP, but in the WIX project they are empty/undefined. The strange thing is that when I reference those environment variables in the WiX files (by using properties in .wxs or preprocessor variables in .wxi) I get the values as expected. Do you have any idea why the environment variables get lost/are undefined in the .wixproj? By the way, the (solution) build process is triggered from inside VS 2010.

    Read the article

  • Cannot resolve IHttpHandler

    - by baron
    For some reason, when I am trying to create a class that implements IHttpHandler, I cannot resolve IHttpHandler. Statements like "using System.Web;" are not helping either. This is a class library project; I am following the example here: http://www.15seconds.com/issue/020417.htm. What am I doing wrong?
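    A minimal handler skeleton, in case it helps isolate the problem (the class name is made up). The key point is that a class library does not reference the System.Web assembly by default; the using directive only shortens names, so IHttpHandler stays unresolved until a project reference to System.Web is added under References:

        using System.Web;

        public class HelloHandler : IHttpHandler
        {
            public bool IsReusable
            {
                get { return true; }
            }

            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/plain";
                context.Response.Write("Hello from the handler");
            }
        }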

    Read the article

  • Using jQuery for client side validation in MVC2 RTM

    - by tigermain
    Scott Gu's tutorial on model validation gets us all set up with the MS client-side validation using the following scripts:

        <script src="../../Scripts/jquery.validate.min.js" type="text/javascript"></script>
        <script src="../../Scripts/MicrosoftMvcValidation.js" type="text/javascript"></script>

    However, I've seen various posts allowing us to utilise jQuery instead, with the following code:

        <script src="https://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.min.js" type="text/javascript"></script>
        <script src="https://ajax.microsoft.com/ajax/jQuery.Validate/1.6/jQuery.Validate.min.js" type="text/javascript"></script>
        <script src="<%= Url.Content("~/scripts/MicrosoftMvcJQueryValidation.js") %>" type="text/javascript"></script>

    However, MicrosoftMvcJQueryValidation.js does not ship with the solution, and from what I read it should be part of the Futures pack, which is no longer available on CodePlex. I managed to find a version alongside jQuery 1.3.2, but it does not work. What is the going-forward solution!?

    Read the article

  • Logging errors in SCSF

    - by WF
    I'm quite new to SCSF. I'm developing an SCSF WinForms application in C# (using the May 2007 version in VS.NET 2005, Framework 2.0; I can't use the newer version). I've implemented a business module. What is the best practice for logging errors? I've configured the Logging Application Block, but how do I use it? Thanks for any answers.
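    A minimal sketch of the usual pattern, assuming the Logging Application Block is already configured in the config file; the category name "General" and the OrderService class are illustrative only, not names provided by SCSF:

        using Microsoft.Practices.EnterpriseLibrary.Logging;

        public class OrderService
        {
            public void Save()
            {
                try
                {
                    // ... business logic in the module ...
                }
                catch (System.Exception ex)
                {
                    // Logger routes the entry to whatever listeners (flat file,
                    // event log, ...) the configuration defines for this category.
                    Logger.Write(ex.ToString(), "General");
                    throw;
                }
            }
        }

    In an SCSF solution the call is often wrapped in a service registered with the WorkItem so modules stay decoupled from the Enterprise Library types, but the Logger.Write call itself is the same.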

    Read the article

  • forward/strong enum in VS2010

    - by Noah Roberts
    At http://blogs.msdn.com/vcblog/archive/2010/04/06/c-0x-core-language-features-in-vc10-the-table.aspx there is a table showing C++0x features that are implemented in the VS2010 RC. Among them are forward-declared enums and strongly typed enums, but they are listed as "partial". The main text of the article says that this means they are either incomplete or implemented in some non-standard way.

    So I've got VS2010 RC and am playing around with the C++0x features, but I can't figure these two out and can't find any documentation on them. Not even the simplest attempts compile.

        enum class E { test };
        int main() {}

    fails with:

        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(518): error C2332: 'enum' : missing tag name
        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(518): error C2236: unexpected 'class' 'E'. Did you forget a ';'?
        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(518): error C3381: 'E' : assembly access specifiers are only available in code compiled with a /clr option
        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(518): error C2143: syntax error : missing ';' before '}'
        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(518): error C4430: missing type specifier - int assumed. Note: C++ does not support default-int
        ========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ==========

    And:

        int main() {
            enum E : short;
        }

    fails with:

        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(513): warning C4480: nonstandard extension used: specifying underlying type for enum 'main::E'
        1e:\dev_workspace\experimental\2010_feature_assessment\2010_feature_assessment\main.cpp(513): error C2059: syntax error : ';'
        ========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ==========

    So it seems it must be some totally non-standard implementation that has allowed them to justify calling these features "partially" done. How would I rewrite that code to access the forward-declaration and strong-typing features?

    Read the article

  • Create MSI for slipstream SQL 2008 install

    - by Graham
    I am looking at creating an MSI that will check/install the prerequisites of SQL Server 2008 and, after the prerequisites are installed, start my slipstreamed SQL 2008 install. I am currently trying to do this through VS 2008 deployment projects, but I cannot simply add the folders of the SQL slipstream into the setup project without recreating the entire folder structure of the SQL install. So my questions are:

        Is this possible to do through deployment projects? (If yes, please assist with links or help.)
        Is there a better way to do this? (I would prefer not to use WISE.)

    Read the article

  • How do I compile boost using __cdecl calling convention?

    - by Sorin Sbarnea
    I have a project compiled using the __cdecl calling convention (MSVC 2010), and I compiled boost with the same compiler using the default settings. The project linked with boost, but at runtime I got an assert message like this:

        File: ...\boost\boost\program_options\detail\parsers.hpp
        Line: 79
        Run-Time Check Failure #0 - The value of ESP was not properly saved across a function call. This is usually
        a result of calling a function declared with one calling convention with a function pointer declared with a
        different calling convention.

    There are the following questions:

        What calling convention does boost build with by default on Windows (MSVC 2010)?
        How do I compile boost with the __cdecl calling convention?
        Why wasn't boost able to prevent linking with code that uses a different calling convention? I understood that boost has really smart library auto-inclusion code.

    Read the article

  • How can I stop an auto-generated Linq to SQL class from loading ALL data?

    - by Gary McGill
    DUPLICATE of http://stackoverflow.com/questions/2433422/how-can-i-stop-an-auto-generated-linq-to-sql-class-from-loading-all-data -- post answers there!

    I have an ASP.NET MVC project, much like the NerdDinner tutorial example. (I'm using MVC 2, but followed the NerdDinner tutorial in order to create it.) As per the instructions in part 3 of the tutorial, I've created a Linq-to-SQL model of my database by creating a "Linq to SQL Classes" (.dbml) surface and dropping my database tables onto it. The designer has automatically added relationships between the generated classes based on my database tables.

    Let's say that my classes are as per the NerdDinner example, so I have Dinner and RSVP tables, where each Dinner record is associated with many RSVP records; hence in the generated classes, the Dinner object has an RSVPs property which is a list of RSVP objects.

    My problem is this: it appears (and I'd be gladly proved wrong on this) that as soon as I access a Dinner object, it's loading all of the corresponding RSVP objects, even if I don't use the RSVPs member.

    First question: is this really the default behavior for the generated classes? In my particular situation, the object graph contains many more tables (which have an order of magnitude more records), and so this is disastrous behaviour; I'd be loading tons of data when all I want to do is show the details of a single parent record.

    Second question: are there any properties exposed through the designer UI that would let me modify this behavior? (I can't find any.)

    Third question: I've seen a description of how to control the loading of related records in a DataContext by using a DataShape object associated with the DataContext. Is that what I'm meant to do, and if so are there any tutorials like the NerdDinner one that would show not only how to do it, but also suggest a 'pattern' for normal use?
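    For reference, the class that shipped for controlling this is System.Data.Linq.DataLoadOptions (older write-ups appear to use the pre-release name DataShape). A sketch against the NerdDinner-style classes from the question; DinnerDataContext is an assumed name for the generated context:

        using System.Data.Linq;

        public static class LoadingSketch
        {
            public static void Configure(DinnerDataContext db)
            {
                // By default DeferredLoadingEnabled is true, so dinner.RSVPs is only
                // queried when it is actually touched. Setting it to false stops any
                // implicit loading of related rows.
                db.DeferredLoadingEnabled = false;

                // Alternatively, opt specific relationships into eager loading so one
                // query brings back exactly the graph you want and nothing more.
                var options = new DataLoadOptions();
                options.LoadWith<Dinner>(d => d.RSVPs);
                db.LoadOptions = options;   // must be assigned before the first query runs
            }
        }

    Either knob is set in code on the DataContext; the .dbml designer exposes a Delay Loaded property on columns but, as far as I can tell, nothing equivalent on associations, which may be why nothing relevant shows up in the designer UI.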

    Read the article
