Search Results

Search found 24117 results on 965 pages for 'write'.


  • Adaptive Connections For ADFBC

    - by Duncan Mills
    Some time ago I wrote an article on Adaptive Bindings showing how the pageDef for an ADF UI does not have to be wedded to a fixed data control or collection / View Object. That article has proved pretty popular, so as a follow-up I wanted to cover another "Adaptive" feature of your ADF applications: the ability to make multiple different connections from an Application Module at runtime.

    Now, I'm sure you'll be aware that if you define your application to use a data source rather than a hard-coded JDBC connection string, then you have the ability to change the target of that data source after deployment to point to a different database. So that's great, but the reality is that this single connection is effectively fixed within the application, right? Well no, this it turns out is a common misconception. To be clear, a single instance of an ADF Application Module is associated with a single connection, but there is nothing to stop you from creating multiple instances of the same Application Module within the application, all pointing at different connections. In fact this has been possible for a long time using a custom extension point with code that extends oracle.jbo.http.HttpSessionCookieFactory. This approach, however, involves writing code, and no-one likes to write any more code than they need to, so is there an easier way? Yes indeed: a little-publicized feature that's available in all versions of 11g, the ELEnvInfoProvider.

    What Does it Do? The ELEnvInfoProvider is a pre-existing class (the full path is oracle.jbo.client.ELEnvInfoProvider) which you can plug into your Application Module configuration using the jbo.envinfoprovider property. Visually you can set this in the editor, or you can set it directly in the bc4j.xcfg (see below for an example). Once you have plugged in this EnvInfoProvider, here's the fun bit: rather than defining the hard-coded name of a data source, you can plug in an EL expression for the connection to use. So what's the benefit of that? It allows you to defer the selection of a connection until the point in time that you instantiate the AM.

    To define the expression itself you'll need to do a couple of things. First of all you'll need a managed bean of some sort – e.g. a session-scoped bean defined in your ViewController project. This will need a getter method that returns the name of the connection. The connection itself needs to be defined in your Application Server, and can be managed through Enterprise Manager, WLST or through MBeans. (You may need to read the documentation [http://docs.oracle.com/cd/E28280_01/web.1111/b31974/deployment_topics.htm#CHDJGBDD] on how to configure connections at runtime if you're not familiar with this.) The EL expression (e.g. ${connectionManager.connection}) is then defined in the configuration by editing the bc4j.xcfg file (there is a hyperlink directly to this file on the configuration editing screen in the Application Module editor). You simply replace the hardcoded JDBCName value with the expression.
    So your cfg file would end up looking something like this (notice the reference to the ELEnvInfoProvider that I talked about earlier):

        <BC4JConfig version="11.1" xmlns="http://xmlns.oracle.com/bc4j/configuration">
          <AppModuleConfigBag ApplicationName="oracle.demo.model.TargetAppModule">
            <AppModuleConfig DeployPlatform="LOCAL"
                             JDBCName="${connectionManager.connection}"
                             jbo.project="oracle.demo.model.Model"
                             name="TargetAppModuleLocal"
                             ApplicationName="oracle.demo.model.TargetAppModule">
              <AM-Pooling jbo.doconnectionpooling="true"/>
              <Database jbo.locking.mode="optimistic"/>
              <Security AppModuleJndiName="oracle.demo.model.TargetAppModule"/>
              <Custom jbo.envinfoprovider="oracle.jbo.client.ELEnvInfoProvider"/>
            </AppModuleConfig>
          </AppModuleConfigBag>
        </BC4JConfig>

    Still Don't Quite Get It? So far you might be thinking: well, that's fine, but what difference does it make if the connection is resolved "just in time" rather than up front and changed as required through Enterprise Manager? A trivial example would be where you have a single application deployed to your application server, but for different users you want to connect to different databases. Because the evaluation of the connection is deferred until you first reference the AM, you have a decision point that can take the user identity into account. However, think about it for a second: under what circumstances does a new AM get instantiated? At the first reference of the AM within the application, yes, but also whenever a Task Flow is entered - if the data control scope for the Task Flow is ISOLATED. So the reality is that on a single screen you can embed multiple Task Flows, all of which are pointing at different database connections concurrently. Hopefully you'll find this feature useful, let me know...
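    The post doesn't show the managed bean itself, so here is a minimal sketch of what a session-scoped bean backing the ${connectionManager.connection} expression might look like. Everything in it is illustrative: the bean would be registered under the name "connectionManager", and the tenantId session key and the two JNDI data-source names are placeholders for whatever selection logic and connections exist in your own environment.

        import oracle.adf.share.ADFContext;

        // Register this class as a session-scoped managed bean named "connectionManager"
        // so that ${connectionManager.connection} resolves to getConnection().
        public class ConnectionManagerBean {

            public String getConnection() {
                // Example decision point: pick a data source based on something known
                // about the current session (user identity, tenant, and so on).
                Object tenant = ADFContext.getCurrent().getSessionScope().get("tenantId");
                return "north".equals(tenant) ? "jdbc/NorthDS" : "jdbc/SouthDS";
            }
        }

    Whatever string the getter returns has to name a JDBC data source that already exists on the application server, as described above.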

    Read the article

  • BIP 11g Dynamic SQL

    - by Tim Dexter
    Back in the 10g release, if you wanted something beyond the standard query for your report extract, you needed to break out your favorite text editor. You gotta love 'vi' and hate emacs, am I right? And get to building a data template - they were/are lovely to write, such fun ... not! It's not fun writing them by hand, but you do get to do some cool stuff around the data extract, including dynamic SQL. By that I mean the ability to add content dynamically to your query at runtime. With 11g, we spoiled you with a visual builder: no more vi or notepad sessions, just a friendly drag-and-drop interface allowing you to build hierarchical data sets, calculated columns, summary columns, etc. You can still create dynamic SQL statements; it's just not so well documented right now, so in lieu of doc updates here's the skinny.

    If you check out the 10g process for creating dynamic SQL in the docs, you'll see you need to create a data trigger function where you assign the dynamic SQL to a global variable that's matched in your report SQL. In 11g, the process is really the same; BI Publisher just provides a bit more help to define what trigger code needs to be called. You still need to create the function and place it inside a package in the db. Here's a simple PL/SQL package with the 'before data' function trigger.

    Spec:

        create or replace PACKAGE BIREPORTS AS
          whereCols varchar2(2000);
          FUNCTION beforeReportTrig return boolean;
        end BIREPORTS;

    Body:

        create or replace PACKAGE BODY BIREPORTS AS
          FUNCTION beforeReportTrig return boolean AS
          BEGIN
            whereCols := ' and d.department_id = 100';
            RETURN true;
          END beforeReportTrig;
        END BIREPORTS;

    You'll notice the additional where clause (whereCols - declared as a public variable) is hard coded. I'll cover parameterizing that in my next post; if you cannot wait, check the 10g docs for an example. I have my package compiling successfully in the db. Now, onto the BIP data model definition.

    1. Create a new data model and go ahead and create your query(s) as you would normally.

    2. In the query dialog box, add in the variables you want replaced at runtime using an ampersand rather than a colon, e.g. &whereCols.

        select
          d.DEPARTMENT_NAME, ...
        from
          "OE"."EMPLOYEES" e,
          "OE"."DEPARTMENTS" d
        where
          d."DEPARTMENT_ID" = e."DEPARTMENT_ID" &whereCols

    Note that 'whereCols' matches the global variable name in our package. When you click OK to clear the dialog, you'll be asked for a default value for the variable; just use ' and 1=1'. The leading space is important to keep the SQL valid, i.e. required whitespace. This default will be used for the where clause in case it's not set by the function code.

    3. Now click on the Event Triggers tree node and create a new trigger of type Before Data. Type in the default package name, in my example 'BIREPORTS', then hit the update button to get BIP to fetch the valid functions. Select the BEFOREREPORTTRIG function (or your own function name) and shuttle it across.

    4. Save your data model and now test it. For now, you can update the where clause via the PL/SQL package. Next time ... parameterizing the dynamic clause.
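    To make the substitution explicit: with the package body above setting whereCols, the statement BI Publisher effectively ends up running is the query below (the column list is elided exactly as in the example above, and the default ' and 1=1' would appear in place of the extra predicate if the trigger had not set the variable).

        select
          d.DEPARTMENT_NAME, ...
        from
          "OE"."EMPLOYEES" e,
          "OE"."DEPARTMENTS" d
        where
          d."DEPARTMENT_ID" = e."DEPARTMENT_ID"
          and d.department_id = 100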

    Read the article

  • Web optimization

    - by hmloo
    1. CSS Optimization

    Organize your CSS code. Good CSS organization helps with future maintainability of the site; it helps you and your team members understand the CSS more quickly and jump to specific styles.

    Structure CSS code. For a small project, you can break your CSS code into separate blocks according to the structure of the page or page content. For example, you can break your CSS document up according to the content of your web page (e.g. Header, Main Content, Footer).

    Structure CSS files. For a large project, you may feel you have too much CSS code in one place, so it's best to structure your CSS into several CSS files and use a master style sheet to import them. This solution not only organizes the style structure, it can also reduce server requests.

        /*-------------- Master style sheet --------------*/
        @import "Reset.css";
        @import "Structure.css";
        @import "Typography.css";
        @import "Forms.css";

    Create an index for your CSS. Another important thing is to create an index at the beginning of your CSS file; the index helps you quickly understand the whole CSS structure.

        /*----------------------------------------
        1. Header
        2. Navigation
        3. Main Content
        4. Sidebar
        5. Footer
        ------------------------------------------*/

    Write efficient CSS selectors. Keep in mind that browsers match CSS selectors from right to left, and the order of efficiency for selectors is:

    1. id (#myid)
    2. class (.myclass)
    3. tag (div, h1, p)
    4. adjacent sibling (h1 + p)
    5. child (ul > li)
    6. descendant (li a)
    7. universal (*)
    8. attribute (a[rel="external"])
    9. pseudo-class and pseudo-element (a:hover, li:first)

    The rightmost selector is called the "key selector", so when you write your CSS you should choose the most efficient key selector you can. Here are some best practices:

    Don't tag-qualify. Never do this:

        div#myid
        div.myclass
        .myclass#myid

    IDs are unique and classes are more specific than a tag, so they don't need the tag qualifier; adding it only makes the selector less efficient.

    Avoid overqualifying selectors. For example, #nav a is more efficient than ul#nav li a.

    Don't repeat declarations. Example:

        body { font-size: 12px; }
        h1 { font-size: 12px; font-weight: bold; }

    Since h1 already inherits the font-size from body, you don't need to repeat the attribute.

    Use 0 instead of 0px. Always use #selector { margin: 0; }. There's no need to include the px after 0, and removing all those superfluous px can reduce the size of your CSS file.

    Group declarations. Example:

        h1 { font-size: 16pt; }
        h1 { color: #fff; }
        h1 { font-family: Arial, sans-serif; }

    It's much better to combine them:

        h1 { font-size: 16pt; color: #fff; font-family: Arial, sans-serif; }

    Group selectors. Example:

        h1 { color: #fff; font-family: Arial, sans-serif; }
        h2 { color: #fff; font-family: Arial, sans-serif; }

    It would be much better set up as:

        h1, h2 { color: #fff; font-family: Arial, sans-serif; }

    Group attributes. Example:

        h1 { color: #fff; font-family: Arial, sans-serif; }
        h2 { color: #fff; font-family: Arial, sans-serif; font-size: 16pt; }

    You can set different rules for specific elements after setting a rule for the group:

        h1, h2 { color: #fff; font-family: Arial, sans-serif; }
        h2 { font-size: 16pt; }

    Use shorthand properties. Example:

        #selector { margin-top: 8px; margin-right: 4px; margin-bottom: 8px; margin-left: 4px; }

    Better:

        #selector { margin: 8px 4px 8px 4px; }

    Best:

        #selector { margin: 8px 4px; }

    (The original post includes a diagram illustrating how shorthand declarations are interpreted depending on how many values are specified for the margin and padding properties.)
    Instead of using:

        #selector { background-image: url("logo.png"); background-position: top left; background-repeat: no-repeat; }

    use:

        #selector { background: url("logo.png") no-repeat top left; }

    2. Image Optimization

    Image Optimizer. Image Optimizer is a free Visual Studio 2010 extension that optimizes PNG, GIF and JPG file sizes without quality loss. It uses SmushIt and PunyPNG for the optimization. Just right-click on any folder or image in Solution Explorer and choose Optimize Images; it will automatically optimize all PNG, GIF and JPEG files in that folder.

    CSS Image Sprites. CSS image sprites are a way to combine a collection of images into a single image, then use the CSS background-position property to shift the visible area to show the required image. Many separate images can take a long time to load and generate multiple server requests, so an image sprite can reduce the number of server requests and improve site performance. You can use many online tools to generate your image sprite and CSS, and you can also try the Sprite and Image Optimization framework released by the ASP.NET team.
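    The sprite technique itself only needs a couple of rules. The sketch below is purely illustrative: it assumes a hypothetical icons.png containing two 16x16 icons stacked vertically, so the file name, sizes and offsets are made up.

        /* one downloaded image, shared by every icon class */
        .icon { background: url("icons.png") no-repeat; width: 16px; height: 16px; display: inline-block; }

        /* shift the background so only the wanted 16x16 slice shows */
        .icon-home   { background-position: 0 0; }     /* first icon in the sprite */
        .icon-search { background-position: 0 -16px; } /* second icon, 16px further down */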

    Read the article

  • Part 4 of 4: Tips/Tricks for Silverlight Developers

    - by mbcrump
    Part 1 | Part 2 | Part 3 | Part 4

    I wanted to create a series of blog posts that gets right to the point and is aimed specifically at Silverlight developers. The most important things I want this series to answer are: What is it? Why do I care? How do I do it? I hope that you enjoy this series. Let's get started:

    Tip/Trick #16) What is it? Find out version information about Silverlight, and the user agent it is running under, by going to http://issilverlightinstalled.com/scriptverify/. Why do I care? I've had those users where it's just easier to give them a site and say "copy/paste the line that says User Agent" in order to troubleshoot a Silverlight problem. I've also been debugging my own Silverlight applications and needed an easy way to determine whether the plugin is disabled or not. How do I do it: Simply navigate to http://issilverlightinstalled.com/scriptverify/ and hit the Verify button. Example screenshots (not reproduced here): results from Chrome 7, and from Internet Explorer 8 with Silverlight disabled.

    Tip/Trick #17) What is it? Use lambdas whenever you can. Why do I care? It is my personal opinion that code is easier to read using lambdas once you get past the syntax. How do I do it: For example, you may write code like the following:

        void MainPage_Loaded(object sender, RoutedEventArgs e)
        {
            // Check and see if we have a newer .XAP file on the server
            Application.Current.CheckAndDownloadUpdateAsync();
            Application.Current.CheckAndDownloadUpdateCompleted +=
                new CheckAndDownloadUpdateCompletedEventHandler(Current_CheckAndDownloadUpdateCompleted);
        }

        void Current_CheckAndDownloadUpdateCompleted(object sender, CheckAndDownloadUpdateCompletedEventArgs e)
        {
            if (e.UpdateAvailable)
            {
                MessageBox.Show(
                    "An update has been installed. To see the updates please exit and restart the application");
            }
        }

    To me this style forces me to look for the other method to see what the code is actually doing. The style below is much easier to read in my opinion and does the exact same thing:

        void MainPage_Loaded(object sender, RoutedEventArgs e)
        {
            // Check and see if we have a newer .XAP file on the server
            Application.Current.CheckAndDownloadUpdateAsync();
            Application.Current.CheckAndDownloadUpdateCompleted += (s, args) =>
            {
                if (args.UpdateAvailable)
                {
                    MessageBox.Show(
                        "An update has been installed. To see the updates please exit and restart the application");
                }
            };
        }

    Tip/Trick #18) What is it? Prevent development web service references from breaking when Visual Studio auto-generates a new port number. Why do I care? We have all been there: we are developing a Silverlight application and all of a sudden our development web services break. We check and find out that the local port number that Visual Studio assigned has changed, and now we need to update all of our service references. We need a way to stop this. How do I do it: This can actually be prevented with just a few mouse clicks. Right-click on your web project, go to Properties and click the tab that says Web. You just need to click the radio button and specify a port number. Now you won't be bothered with that anymore.

    Tip/Trick #19) What is it? You can disable the Close button on a ChildWindow. Why do I care? I wouldn't blog about it if I hadn't seen it: devs trying to override keystrokes to prevent users from closing a ChildWindow. How do I do it: A property exists on the ChildWindow called "HasCloseButton"; simply change that to false and your close button is gone, as in the snippet below.
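    For example (a minimal sketch - the class name and title are just what the default ChildWindow template would generate, so adjust to your own):

        <controls:ChildWindow x:Class="MyApp.MyDialog"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            xmlns:controls="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls"
            HasCloseButton="False"
            Title="MyDialog">
            <!-- dialog content goes here -->
        </controls:ChildWindow>

    The same thing can be set from code-behind with this.HasCloseButton = false; before showing the window.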
    You can also delete the "Cancel" button and add some logic to the OK button if you want the user to respond before proceeding.

    Tip/Trick #20) What is it? Clean up your XAML. Why do I care? By removing unneeded namespaces, not naming all of your controls, and getting rid of designer markup, you can improve code quality and readability. How do I do it (this is a 3-in-one tip):

    1) Remove unused designer markup. Have you ever wondered what the following code snippet does?

        xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
        xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
        mc:Ignorable="d" d:DesignWidth="640" d:DesignHeight="480"

    This code is telling the designer to do something special with this page in design mode - specifically the width and the height of the page. When it's running in the browser this information is not used; it is ignored by the XAML parser. In other words, if you don't need it then delete it.

    2) If you are not using a namespace then remove it. Resharper will gray out the namespaces you're not using; if you don't have Resharper you can look through your XAML and manually remove the unneeded ones.

    3) Don't name a control unless you actually need to refer to it in procedural code. If you name a control you will take a slight performance hit that is totally unnecessary if it's never referenced.

        <TextBlock Height="23" Text="TextBlock" />

    That is the end of the series. I hope that you enjoyed it, and please check out Part 1 | Part 2 | Part 3 if you're hungry for more.

    Read the article

  • Plagued by multithreaded bugs

    - by koncurrency
    On my new team that I manage, the majority of our code is platform, TCP socket, and HTTP networking code. All C++. Most of it originated from other developers that have left the team. The current developers on the team are very smart, but mostly junior in terms of experience. Our biggest problem: multi-threaded concurrency bugs. Most of our class libraries are written to be asynchronous by use of some thread pool classes. Methods on the class libraries often enqueue long-running tasks onto the thread pool from one thread, and then the callback methods of that class get invoked on a different thread. As a result, we have a lot of edge-case bugs involving incorrect threading assumptions. This results in subtle bugs that go beyond just having critical sections and locks to guard against concurrency issues. What makes these problems even harder is that the attempted fixes are often incorrect. Some mistakes I've observed the team attempting (or within the legacy code itself) include the following:

    Common mistake #1 - Fixing a concurrency issue by just putting a lock around the shared data, but forgetting about what happens when methods don't get called in the expected order. Here's a very simple example:

        void Foo::OnHttpRequestComplete(statuscode status)
        {
            m_pBar->DoSomethingImportant(status);
        }

        void Foo::Shutdown()
        {
            m_pBar->Cleanup();
            delete m_pBar;
            m_pBar = nullptr;
        }

    So now we have a bug in which Shutdown could get called while OnHttpRequestComplete is executing. A tester finds the bug, captures the crash dump, and assigns the bug to a developer. He in turn fixes the bug like this:

        void Foo::OnHttpRequestComplete(statuscode status)
        {
            AutoLock lock(m_cs);
            m_pBar->DoSomethingImportant(status);
        }

        void Foo::Shutdown()
        {
            AutoLock lock(m_cs);
            m_pBar->Cleanup();
            delete m_pBar;
            m_pBar = nullptr;
        }

    The above fix looks good until you realize there's an even more subtle edge case: what happens if Shutdown gets called before OnHttpRequestComplete gets called back? The real-world examples my team has are even more complex, and the edge cases are even harder to spot during the code review process.

    Common mistake #2 - Fixing deadlock issues by blindly exiting the lock, waiting for the other thread to finish, then re-entering the lock - but without handling the case that the object just got updated by the other thread!

    Common mistake #3 - Even though the objects are reference counted, the shutdown sequence "releases" its pointer but forgets to wait for the thread that is still running to release its instance. As a result, components are shut down cleanly, then spurious or late callbacks are invoked on an object in a state not expecting any more calls.

    There are other edge cases, but the bottom line is this: multithreaded programming is just plain hard, even for smart people. As I catch these mistakes, I spend time discussing the errors with each developer and working out a more appropriate fix. But I suspect they are often confused about how to solve each issue because of the enormous amount of legacy code that the "right" fix will involve touching. We're going to be shipping soon, and I'm sure the patches we're applying will hold for the upcoming release. Afterwards, we're going to have some time to improve the code base and refactor where needed. We won't have time to just re-write everything, and the majority of the code isn't all that bad. But I'm looking to refactor the code such that threading issues can be avoided altogether. One approach I am considering is this.
    For each significant platform feature, have a dedicated single thread that all events and network callbacks get marshalled onto, similar to COM apartment threading in Windows with use of a message loop (a rough sketch of what I mean follows at the end of this question). Long blocking operations could still get dispatched to a work-pool thread, but the completion callback is invoked on the component's thread. Components could possibly even share the same thread. Then all the class libraries running inside the thread can be written under the assumption of a single-threaded world.

    Before I go down that path, I am also very interested in whether there are other standard techniques or design patterns for dealing with multithreaded issues. And I have to emphasize - something beyond a book that describes the basics of mutexes and semaphores. What do you think?

    I am also interested in any other approaches to take towards a refactoring process, including any of the following:

    - Literature or papers on design patterns around threads. Something beyond an introduction to mutexes and semaphores. We don't need massive parallelism either, just ways to design an object model so as to handle asynchronous events from other threads correctly.
    - Ways to diagram the threading of various components, so that it will be easy to study and evolve solutions for. (That is, a UML equivalent for discussing threads across objects and classes.)
    - Educating your development team on the issues with multithreaded code.

    What would you do?
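    To make the per-component dispatcher idea concrete, here is a rough sketch of the kind of thing I mean. It is purely illustrative - the class and method names are made up, and it uses C++11 standard-library primitives for brevity, whereas the real code base would use whatever thread and event primitives it already has.

        #include <condition_variable>
        #include <functional>
        #include <mutex>
        #include <queue>
        #include <thread>

        // A minimal "component thread": every task posted here runs on the same
        // thread, so the component's internals can be written single-threaded.
        class Dispatcher {
        public:
            Dispatcher() : m_done(false), m_thread([this] { Run(); }) {}

            ~Dispatcher() {
                { std::lock_guard<std::mutex> lock(m_mutex); m_done = true; }
                m_cv.notify_one();
                m_thread.join();
            }

            // Marshal a callback onto the component's thread.
            void Post(std::function<void()> task) {
                { std::lock_guard<std::mutex> lock(m_mutex); m_queue.push(std::move(task)); }
                m_cv.notify_one();
            }

        private:
            void Run() {
                for (;;) {
                    std::function<void()> task;
                    {
                        std::unique_lock<std::mutex> lock(m_mutex);
                        m_cv.wait(lock, [this] { return m_done || !m_queue.empty(); });
                        if (m_done && m_queue.empty())
                            return;
                        task = std::move(m_queue.front());
                        m_queue.pop();
                    }
                    task();  // executed outside the lock, always on this one thread
                }
            }

            std::mutex m_mutex;
            std::condition_variable m_cv;
            std::queue<std::function<void()>> m_queue;
            bool m_done;
            std::thread m_thread;
        };

    A network callback arriving on a pool thread would then do something like dispatcher.Post([=] { OnHttpRequestComplete(status); }); instead of calling into the object directly, so ordering problems like the Shutdown example above become ordering of items on a single queue.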

    Read the article

  • Invalid algorithm specified on Windows 2003 Server only

    - by JL
    I am decoding a file using the following method: string outFileName = zfoFileName.Replace(".zfo", "_tmp.zfo"); FileStream inFile = null; FileStream outFile = null; inFile = File.Open(zfoFileName, FileMode.Open); outFile = File.Create(outFileName); LargeCMS.CMS cms = new LargeCMS.CMS(); cms.Decode(inFile, outFile); This is working fine on my Win 7 dev machine, but on a Windows 2003 server production machine it fails with the following exception: Exception: System.Exception: CryptMsgUpdate error #-2146893816 --- System.ComponentModel.Win32Exception: Invalid algorithm specified --- End of inner exception stack trace --- at LargeCMS.CMS.Decode(FileStream inFile, FileStream outFile) Here are the classes below which I call to do the decoding, if needed I can upload a sample file for decoding, its just strange it works on Win 7, and not on Win2k3 server: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.IO; using System.Security.Cryptography; using System.Security.Cryptography.X509Certificates; using System.Runtime.InteropServices; using System.ComponentModel; namespace LargeCMS { class CMS { // File stream to use in callback function private FileStream m_callbackFile; // Streaming callback function for encoding private Boolean StreamOutputCallback(IntPtr pvArg, IntPtr pbData, int cbData, Boolean fFinal) { // Write all bytes to encoded file Byte[] bytes = new Byte[cbData]; Marshal.Copy(pbData, bytes, 0, cbData); m_callbackFile.Write(bytes, 0, cbData); if (fFinal) { // This is the last piece. Close the file m_callbackFile.Flush(); m_callbackFile.Close(); m_callbackFile = null; } return true; } // Encode CMS with streaming to support large data public void Encode(X509Certificate2 cert, FileStream inFile, FileStream outFile) { // Variables Win32.CMSG_SIGNER_ENCODE_INFO SignerInfo; Win32.CMSG_SIGNED_ENCODE_INFO SignedInfo; Win32.CMSG_STREAM_INFO StreamInfo; Win32.CERT_CONTEXT[] CertContexts = null; Win32.BLOB[] CertBlobs; X509Chain chain = null; X509ChainElement[] chainElements = null; X509Certificate2[] certs = null; RSACryptoServiceProvider key = null; BinaryReader stream = null; GCHandle gchandle = new GCHandle(); IntPtr hProv = IntPtr.Zero; IntPtr SignerInfoPtr = IntPtr.Zero; IntPtr CertBlobsPtr = IntPtr.Zero; IntPtr hMsg = IntPtr.Zero; IntPtr pbPtr = IntPtr.Zero; Byte[] pbData; int dwFileSize; int dwRemaining; int dwSize; Boolean bResult = false; try { // Get data to encode dwFileSize = (int)inFile.Length; stream = new BinaryReader(inFile); pbData = stream.ReadBytes(dwFileSize); // Prepare stream for encoded info m_callbackFile = outFile; // Get cert chain chain = new X509Chain(); chain.Build(cert); chainElements = new X509ChainElement[chain.ChainElements.Count]; chain.ChainElements.CopyTo(chainElements, 0); // Get certs in chain certs = new X509Certificate2[chainElements.Length]; for (int i = 0; i < chainElements.Length; i++) { certs[i] = chainElements[i].Certificate; } // Get context of all certs in chain CertContexts = new Win32.CERT_CONTEXT[certs.Length]; for (int i = 0; i < certs.Length; i++) { CertContexts[i] = (Win32.CERT_CONTEXT)Marshal.PtrToStructure(certs[i].Handle, typeof(Win32.CERT_CONTEXT)); } // Get cert blob of all certs CertBlobs = new Win32.BLOB[CertContexts.Length]; for (int i = 0; i < CertContexts.Length; i++) { CertBlobs[i].cbData = CertContexts[i].cbCertEncoded; CertBlobs[i].pbData = CertContexts[i].pbCertEncoded; } // Get CSP of client certificate key = (RSACryptoServiceProvider)certs[0].PrivateKey; bResult = 
Win32.CryptAcquireContext( ref hProv, key.CspKeyContainerInfo.KeyContainerName, key.CspKeyContainerInfo.ProviderName, key.CspKeyContainerInfo.ProviderType, 0 ); if (!bResult) { throw new Exception("CryptAcquireContext error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Populate Signer Info struct SignerInfo = new Win32.CMSG_SIGNER_ENCODE_INFO(); SignerInfo.cbSize = Marshal.SizeOf(SignerInfo); SignerInfo.pCertInfo = CertContexts[0].pCertInfo; SignerInfo.hCryptProvOrhNCryptKey = hProv; SignerInfo.dwKeySpec = (int)key.CspKeyContainerInfo.KeyNumber; SignerInfo.HashAlgorithm.pszObjId = Win32.szOID_OIWSEC_sha1; // Populate Signed Info struct SignedInfo = new Win32.CMSG_SIGNED_ENCODE_INFO(); SignedInfo.cbSize = Marshal.SizeOf(SignedInfo); SignedInfo.cSigners = 1; SignerInfoPtr = Marshal.AllocHGlobal(Marshal.SizeOf(SignerInfo)); Marshal.StructureToPtr(SignerInfo, SignerInfoPtr, false); SignedInfo.rgSigners = SignerInfoPtr; SignedInfo.cCertEncoded = CertBlobs.Length; CertBlobsPtr = Marshal.AllocHGlobal(Marshal.SizeOf(CertBlobs[0]) * CertBlobs.Length); for (int i = 0; i < CertBlobs.Length; i++) { Marshal.StructureToPtr(CertBlobs[i], new IntPtr(CertBlobsPtr.ToInt64() + (Marshal.SizeOf(CertBlobs[i]) * i)), false); } SignedInfo.rgCertEncoded = CertBlobsPtr; // Populate Stream Info struct StreamInfo = new Win32.CMSG_STREAM_INFO(); StreamInfo.cbContent = dwFileSize; StreamInfo.pfnStreamOutput = new Win32.StreamOutputCallbackDelegate(StreamOutputCallback); // TODO: CMSG_DETACHED_FLAG // Open message to encode hMsg = Win32.CryptMsgOpenToEncode( Win32.X509_ASN_ENCODING | Win32.PKCS_7_ASN_ENCODING, 0, Win32.CMSG_SIGNED, ref SignedInfo, null, ref StreamInfo ); if (hMsg.Equals(IntPtr.Zero)) { throw new Exception("CryptMsgOpenToEncode error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Process the whole message gchandle = GCHandle.Alloc(pbData, GCHandleType.Pinned); pbPtr = gchandle.AddrOfPinnedObject(); dwRemaining = dwFileSize; dwSize = (dwFileSize < 1024 * 1000 * 100) ? dwFileSize : 1024 * 1000 * 100; while (dwRemaining > 0) { // Update message piece by piece bResult = Win32.CryptMsgUpdate( hMsg, pbPtr, dwSize, (dwRemaining <= dwSize) ? 
true : false ); if (!bResult) { throw new Exception("CryptMsgUpdate error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Move to the next piece pbPtr = new IntPtr(pbPtr.ToInt64() + dwSize); dwRemaining -= dwSize; if (dwRemaining < dwSize) { dwSize = dwRemaining; } } } finally { // Clean up if (gchandle.IsAllocated) { gchandle.Free(); } if (stream != null) { stream.Close(); } if (m_callbackFile != null) { m_callbackFile.Close(); } if (!CertBlobsPtr.Equals(IntPtr.Zero)) { Marshal.FreeHGlobal(CertBlobsPtr); } if (!SignerInfoPtr.Equals(IntPtr.Zero)) { Marshal.FreeHGlobal(SignerInfoPtr); } if (!hProv.Equals(IntPtr.Zero)) { Win32.CryptReleaseContext(hProv, 0); } if (!hMsg.Equals(IntPtr.Zero)) { Win32.CryptMsgClose(hMsg); } } } // Decode CMS with streaming to support large data public void Decode(FileStream inFile, FileStream outFile) { // Variables Win32.CMSG_STREAM_INFO StreamInfo; Win32.CERT_CONTEXT SignerCertContext; BinaryReader stream = null; GCHandle gchandle = new GCHandle(); IntPtr hMsg = IntPtr.Zero; IntPtr pSignerCertInfo = IntPtr.Zero; IntPtr pSignerCertContext = IntPtr.Zero; IntPtr pbPtr = IntPtr.Zero; IntPtr hStore = IntPtr.Zero; Byte[] pbData; Boolean bResult = false; int dwFileSize; int dwRemaining; int dwSize; int cbSignerCertInfo; try { // Get data to decode dwFileSize = (int)inFile.Length; stream = new BinaryReader(inFile); pbData = stream.ReadBytes(dwFileSize); // Prepare stream for decoded info m_callbackFile = outFile; // Populate Stream Info struct StreamInfo = new Win32.CMSG_STREAM_INFO(); StreamInfo.cbContent = dwFileSize; StreamInfo.pfnStreamOutput = new Win32.StreamOutputCallbackDelegate(StreamOutputCallback); // Open message to decode hMsg = Win32.CryptMsgOpenToDecode( Win32.X509_ASN_ENCODING | Win32.PKCS_7_ASN_ENCODING, 0, 0, IntPtr.Zero, IntPtr.Zero, ref StreamInfo ); if (hMsg.Equals(IntPtr.Zero)) { throw new Exception("CryptMsgOpenToDecode error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Process the whole message gchandle = GCHandle.Alloc(pbData, GCHandleType.Pinned); pbPtr = gchandle.AddrOfPinnedObject(); dwRemaining = dwFileSize; dwSize = (dwFileSize < 1024 * 1000 * 100) ? dwFileSize : 1024 * 1000 * 100; while (dwRemaining > 0) { // Update message piece by piece bResult = Win32.CryptMsgUpdate( hMsg, pbPtr, dwSize, (dwRemaining <= dwSize) ? 
true : false ); if (!bResult) { throw new Exception("CryptMsgUpdate error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Move to the next piece pbPtr = new IntPtr(pbPtr.ToInt64() + dwSize); dwRemaining -= dwSize; if (dwRemaining < dwSize) { dwSize = dwRemaining; } } // Get signer certificate info cbSignerCertInfo = 0; bResult = Win32.CryptMsgGetParam( hMsg, Win32.CMSG_SIGNER_CERT_INFO_PARAM, 0, IntPtr.Zero, ref cbSignerCertInfo ); if (!bResult) { throw new Exception("CryptMsgGetParam error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } pSignerCertInfo = Marshal.AllocHGlobal(cbSignerCertInfo); bResult = Win32.CryptMsgGetParam( hMsg, Win32.CMSG_SIGNER_CERT_INFO_PARAM, 0, pSignerCertInfo, ref cbSignerCertInfo ); if (!bResult) { throw new Exception("CryptMsgGetParam error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Open a cert store in memory with the certs from the message hStore = Win32.CertOpenStore( Win32.CERT_STORE_PROV_MSG, Win32.X509_ASN_ENCODING | Win32.PKCS_7_ASN_ENCODING, IntPtr.Zero, 0, hMsg ); if (hStore.Equals(IntPtr.Zero)) { throw new Exception("CertOpenStore error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Find the signer's cert in the store pSignerCertContext = Win32.CertGetSubjectCertificateFromStore( hStore, Win32.X509_ASN_ENCODING | Win32.PKCS_7_ASN_ENCODING, pSignerCertInfo ); if (pSignerCertContext.Equals(IntPtr.Zero)) { throw new Exception("CertGetSubjectCertificateFromStore error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } // Set message for verifying SignerCertContext = (Win32.CERT_CONTEXT)Marshal.PtrToStructure(pSignerCertContext, typeof(Win32.CERT_CONTEXT)); bResult = Win32.CryptMsgControl( hMsg, 0, Win32.CMSG_CTRL_VERIFY_SIGNATURE, SignerCertContext.pCertInfo ); if (!bResult) { throw new Exception("CryptMsgControl error #" + Marshal.GetLastWin32Error().ToString(), new Win32Exception(Marshal.GetLastWin32Error())); } } finally { // Clean up if (gchandle.IsAllocated) { gchandle.Free(); } if (!pSignerCertContext.Equals(IntPtr.Zero)) { Win32.CertFreeCertificateContext(pSignerCertContext); } if (!pSignerCertInfo.Equals(IntPtr.Zero)) { Marshal.FreeHGlobal(pSignerCertInfo); } if (!hStore.Equals(IntPtr.Zero)) { Win32.CertCloseStore(hStore, Win32.CERT_CLOSE_STORE_FORCE_FLAG); } if (stream != null) { stream.Close(); } if (m_callbackFile != null) { m_callbackFile.Close(); } if (!hMsg.Equals(IntPtr.Zero)) { Win32.CryptMsgClose(hMsg); } } } } } and using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Runtime.InteropServices; using System.Security.Cryptography.X509Certificates; using System.ComponentModel; using System.Security.Cryptography; namespace LargeCMS { class Win32 { #region "CONSTS" public const int X509_ASN_ENCODING = 0x00000001; public const int PKCS_7_ASN_ENCODING = 0x00010000; public const int CMSG_SIGNED = 2; public const int CMSG_DETACHED_FLAG = 0x00000004; public const int AT_KEYEXCHANGE = 1; public const int AT_SIGNATURE = 2; public const String szOID_OIWSEC_sha1 = "1.3.14.3.2.26"; public const int CMSG_CTRL_VERIFY_SIGNATURE = 1; public const int CMSG_CERT_PARAM = 12; public const int CMSG_SIGNER_CERT_INFO_PARAM = 7; public const int CERT_STORE_PROV_MSG = 1; public const int CERT_CLOSE_STORE_FORCE_FLAG = 1; #endregion #region "STRUCTS" 
[StructLayout(LayoutKind.Sequential)] public struct CRYPT_ALGORITHM_IDENTIFIER { public String pszObjId; BLOB Parameters; } [StructLayout(LayoutKind.Sequential)] public struct CERT_ID { public int dwIdChoice; public BLOB IssuerSerialNumberOrKeyIdOrHashId; } [StructLayout(LayoutKind.Sequential)] public struct CMSG_SIGNER_ENCODE_INFO { public int cbSize; public IntPtr pCertInfo; public IntPtr hCryptProvOrhNCryptKey; public int dwKeySpec; public CRYPT_ALGORITHM_IDENTIFIER HashAlgorithm; public IntPtr pvHashAuxInfo; public int cAuthAttr; public IntPtr rgAuthAttr; public int cUnauthAttr; public IntPtr rgUnauthAttr; public CERT_ID SignerId; public CRYPT_ALGORITHM_IDENTIFIER HashEncryptionAlgorithm; public IntPtr pvHashEncryptionAuxInfo; } [StructLayout(LayoutKind.Sequential)] public struct CERT_CONTEXT { public int dwCertEncodingType; public IntPtr pbCertEncoded; public int cbCertEncoded; public IntPtr pCertInfo; public IntPtr hCertStore; } [StructLayout(LayoutKind.Sequential)] public struct BLOB { public int cbData; public IntPtr pbData; } [StructLayout(LayoutKind.Sequential)] public struct CMSG_SIGNED_ENCODE_INFO { public int cbSize; public int cSigners; public IntPtr rgSigners; public int cCertEncoded; public IntPtr rgCertEncoded; public int cCrlEncoded; public IntPtr rgCrlEncoded; public int cAttrCertEncoded; public IntPtr rgAttrCertEncoded; } [StructLayout(LayoutKind.Sequential)] public struct CMSG_STREAM_INFO { public int cbContent; public StreamOutputCallbackDelegate pfnStreamOutput; public IntPtr pvArg; } #endregion #region "DELEGATES" public delegate Boolean StreamOutputCallbackDelegate(IntPtr pvArg, IntPtr pbData, int cbData, Boolean fFinal); #endregion #region "API" [DllImport("advapi32.dll", CharSet = CharSet.Auto, SetLastError = true)] public static extern Boolean CryptAcquireContext( ref IntPtr hProv, String pszContainer, String pszProvider, int dwProvType, int dwFlags ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern IntPtr CryptMsgOpenToEncode( int dwMsgEncodingType, int dwFlags, int dwMsgType, ref CMSG_SIGNED_ENCODE_INFO pvMsgEncodeInfo, String pszInnerContentObjID, ref CMSG_STREAM_INFO pStreamInfo ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern IntPtr CryptMsgOpenToDecode( int dwMsgEncodingType, int dwFlags, int dwMsgType, IntPtr hCryptProv, IntPtr pRecipientInfo, ref CMSG_STREAM_INFO pStreamInfo ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern Boolean CryptMsgClose( IntPtr hCryptMsg ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern Boolean CryptMsgUpdate( IntPtr hCryptMsg, Byte[] pbData, int cbData, Boolean fFinal ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern Boolean CryptMsgUpdate( IntPtr hCryptMsg, IntPtr pbData, int cbData, Boolean fFinal ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern Boolean CryptMsgGetParam( IntPtr hCryptMsg, int dwParamType, int dwIndex, IntPtr pvData, ref int pcbData ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern Boolean CryptMsgControl( IntPtr hCryptMsg, int dwFlags, int dwCtrlType, IntPtr pvCtrlPara ); [DllImport("advapi32.dll", SetLastError = true)] public static extern Boolean CryptReleaseContext( IntPtr hProv, int dwFlags ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern IntPtr CertCreateCertificateContext( int dwCertEncodingType, IntPtr pbCertEncoded, int cbCertEncoded ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern Boolean 
CertFreeCertificateContext( IntPtr pCertContext ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern IntPtr CertOpenStore( int lpszStoreProvider, int dwMsgAndCertEncodingType, IntPtr hCryptProv, int dwFlags, IntPtr pvPara ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern IntPtr CertGetSubjectCertificateFromStore( IntPtr hCertStore, int dwCertEncodingType, IntPtr pCertId ); [DllImport("Crypt32.dll", SetLastError = true)] public static extern IntPtr CertCloseStore( IntPtr hCertStore, int dwFlags ); #endregion } }

    Read the article

  • client problems - misaligned expectations & not following SDLC protocols

    - by louism
    hi guys, im having some serious problems with a client on a project - i could use some advice please the short version i have been working with this client now for almost 6 months without any problems (a classified website project in the range of 500 hours) over the last few days things have drastically deteriorated to the point where ive had to place the project on-hold whilst i work-out what to do (this has pissed the client off even more) to be simplistic, the root cause of the issue is this: the client doesnt read the specs i make for him, i code the feature, he than wants to change things, i tell him its not to the agreed spec and that that change will have to be postponed and possibly charged for, he gets upset and rants saying 'hes paid for the feature' and im not keeping to the agreement (<- misalignment of expectations) i think the root cause of the root cause is my clients failure to take my SDLC protocols seriously. i have a bug tracking system in place which he practically refuses to use (he still emails me bugs), he doesnt seem to care to much for the protocols i use for dealing with scope creep and change control the whole situation came to a head recently where he 'cracked it' (an aussie term for being fed-up). the more terms like 'postponed for post-launch implementation', 'costed feature addition', and 'not to agreed spec' i kept using, the worse it got finally, he began to bully me - basically insisting i shut-up and do the work im being paid for. i wrote a long-winded email explaining how wrong he was on all these different points, and explaining what all the SDLC protocols do to protect the success of the project. than i deleted that email and wrote a new one in the new email, i suggested as a solution i write up a list of grievances we both had. we than review the list and compromise on different points: he gets some things he wants, i get some things i want. sometimes youve got to give ground to get ground his response to this suggestion was flat-out refusal, and a restatement that i should just get on with the work ive been paid to do so there you have the very subjective short version. if you have the time and inclination, the long version may be a little less bias as it has the email communiques between me and my client the long version (with background) the long version works by me showing you the email communiques which lead to the situation coming to a head. so here it is, judge for yourself where the trouble started... 1. client asked me why something was missing from a feature i just uploaded, my response was to show him what was in the spec: it basically said the item he was looking for was never going to be included 2. [clients response...] Memo Louis, We are following your own title fields and keeping a consistent layout. Why the big fuss about not adding "Part". It simply replaces "model" and is consistent with your current title fields. 3. [my response...] hi [client], the 'part' field appeared to me as a redundancy / mistake. 
i requested clarification but never received any in a timely manner (about 2 weeks ago) the specification for this feature also indicated it wasnt going to be included: RE: "Why the big fuss about not adding "Part" " it may not appear so, but it would actually be a lot of work for me to now add a 'Part' field it could take me up to 15-20 minutes to properly explain why its such a big undertaking to do this, but i would prefer to use that time instead to work on completing your v1.1 features as a simplistic explanation - it connects to the change in paradigm from a 'generic classified ad' model to a 'specific attributes for specific categories' model basically, i am saying it is a big fuss, but i understand that it doesnt look that way - after all, it is just one ity-bitty field :) if you require a fuller explanation, please let me know and i will commit the time needed to write that out also, if you recall when we first started on the project, i said that with the effort/time required for features, you would likely not know off the top of your head. you may think something is really complex, but in reality its quite simple, you might think something is easy - but it could actually be a massive trauma to code (which is the case here with the 'Part' field). if you also recalled, i said the best course of action is to just ask, and i would let you know on a case-by-case basis 4. [email from me to client...] hi [client], the online catalogue page is now up live (see my email from a few days ago for information on how it works) note: the window of opportunity for input/revisions on what data the catalogue stores has now closed (as i have put the code up live now) RE: the UI/layout of the online catalogue page you may still do visual/ui tweaks to the page at the moment (this window for input/revisions will close in a couple of days time) 5. [email from client to me...] *(note: i had put up the feature & asked the client to review it, never heard back from them for a few days)* Memo Louis, Here you go again. CLOSED without a word of input from the customer. I don't think so. I will reply tomorrow regarding the content and functionality we require from this feature. 5. [from me to client...] hi [client]: RE: from my understanding, you are saying that the mini-sale yard control would change itself based on the fact someone was viewing for parts & accessories <- is that correct? this change is outside the scope of the v1.1 mini-spec and therefore will need to wait 'til post launch for costing/implementation 6. [email from client to me...] Memo Louis, Following your v1.1 mini-spec and all your time paid in full for the work selected. We need to make the situation clear. There will be no further items held for post-launch. Do not expect us to pay for any further items other than those we have agreed upon. You have undertaken to complete the Parts and accessories feature as follows. Obviously, as part of this process the "mini search" will be effected, and will require "adaption to make sense". 7. [email from me to client...] hi [client], RE: "There will be no further items held for post-launch. Do not expect us to pay for any further items other than those we have agreed upon." a few points to consider: 1) the specification for the 'parts & accessories' feature was as follows: (i.e. [what] "...we have agreed upon.") 2) you have received the 'parts & accessories' feature free of charge (you have paid $0 for it). 
ive spent two days coding that feature as a gesture of good will i would request that you please consider these two facts carefully and sincerely 8. [email from client to me...] Memo Louis, I don't see how you are giving us anything for free. From your original fee proposal you have deleted more than 30 hours of included features. Your title "shelved features". Further you have charged us twice by adding back into the site, at an addition cost, some of those "shelved features" features. See v1.1 mini-spec. Did include in your original fee proposal a change request budget but then charge without discussion items included in v1.1 mini-spec. Included a further Features test plan for a regression test, a fee of 10 hours that would not have been required if the "shelved features" were not left out of the agreed fee proposal. I have made every attempt to satisfy your your uneven business sense by offering you everything your heart desired, in the v1.1 mini-spec, to be left once again with your attitude of "its too hard, lets leave it for post launch". I am no longer accepting anything less than what we have contracted you to do. That is clearly defined in v1.1 mini-spec, and you are paid in advance for delivering those items as an acceptable function. a few notes about the above email... i had to cull features from the original spec because it didnt fit into the budget. i explained this to the client at the start of the project (he wanted more features than he had budget hours to do them all) nothing has been charged for twice, i didnt charge the client for culled features. im charging him to now do those culled features the draft version of the project schedule included a change request budget of 10 hours, but i had to remove that to meet the budget (the client may not have been aware of this to be fair to them) what the client refers to as my attitude of 'too hard/leave it for post-launch', i called a change request protocol and a method for keeping scope creep under control 9. [email from me to client...] hi [client], RE: "...all your grievances..." i had originally written out a long email response; it was fantastic, it had all these great points of how 'you were wrong' and 'i was right', you would of loved it (and by 'loved it', i mean it would of just infuriated you more) so, i decided to deleted it start over, for two reasons: 1) a long email is being disrespectful of your time (youre a busy businessman with things to do) 2) whos wrong or right gets us no closer to fixing the problems we are experiencing what i propose is this... i prepare a bullet point list of your grievances and my grievances (yes, im unhappy too about how things are going - and it has little to do with money) i submit this list to you for you to add to as necessary we then both take a good hard look at this list, and we decide which areas we are willing to give ground on as an example, the list may look something like this: "louis, you keep taking away features you said you would do" [your grievance 2] [your grievance 3] [your grievance ...] "[client], i feel you dont properly read the specs i prepare for you..." [my grievance 2] [my grievance 3] [my grievance ...] if you are willing to give this a try, let me know will it work? who knows. but if it doesnt, we can always go back to arguing some more :) obviously, this will only work if you are willing to give it a genuine try, and you can accept that you may have to 'give some ground to get some ground' what do you think? 10. [email from client to me ...] 
Memo Louis, Instead of wasting your time listing grievances, I would prefer you complete the items in v1.1 mini-spec, to a satisfactory conclusion. We almost had the website ready for launch until you brought the v1.1 mini-spec into the frame. Obviously I expected you could complete the v1.1 mini-spec in a two-week time frame as you indicated and give the site a more profession presentation. Most of the problems have been caused by you not following our instructions, but deciding to do what you feel like at the time. And then arguing with us how the missing information is not necessary. For instance "Parts and Accessories". Why on earth would you leave out the parts heading, when it ties-in with the fields you have already developed. It replaces "model" and is just as important in the context of information that appears in the "Details" panel. We are at a stage where the the v1.1 mini-spec needs to be completed without further time wasting and the site is complete (subject to all features working). We are on standby at this end to do just that. Let me know when you are back, working on the site and we will process and complete each v1.1 mini-spec, item by item, until the job is complete. 11. [last email from me to client...] hi [client], based on this reply, and your demonstrated unwillingness to compromise/give any ground on issues at hand, i have decided to place your project on-hold for the moment i will be considering further options on how to over-come our challenges over the next few days i will contact you by monday 17/may to discuss any new options i have come up with, and if i believe it is appropriate to restart work on your project at that point or not told you it was long... what do you think?

    Read the article

  • Bulk inserting best way to go about it? + Helping me understand fully what I found so far

    - by chobo2
    Hi So I saw this post here and read it and it seems like bulk copy might be the way to go. http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials. http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx First way uses 2 ado.net 2.0 features. BulkInsert and BulkCopy. the second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However as one person pointed out in the posts what he just going around the issue at the cost of performance( nothing wrong with that in my opinion) First I will talk about the 2 ways in the first tutorial I am using VS2010 Express, .net 4.0, MVC 2.0, SQl Server 2005 Is ado.net 2.0 the most current version? Based on the technology I am using, is there some updates to what I am going to show that would improve it somehow? Is there any thing that these tutorial left out that I should know about? BulkInsert I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) ) SP Code USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END C# Code /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } So first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? Like I am sending 500,000 records so I did a Batch size of 500,000. Next why does it crash when I do this? If I set it to 1000 for batch size it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. 
(provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException: Time it took to insert 500,000 records with insert batch size of 1000 took "2 mins and 54 seconds" Of course this is no official time I sat there with a stop watch( I am sure there are better ways but was too lazy to look what they where) So I find that kinda slow compared to all my other ones(expect the linq to sql insert one) and I am not really sure why. Next I looked at bulkcopy /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } This one seemed to go really fast and did not even need a SP( can you use SP with bulk copy? If you can would it be better?) BatchCopy had no problem with a 500,000 batch size.So again why make it smaller then the number of records you want to send? I found that with BatchCopy and 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster then the bulkinsert one above. Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc C# code. /// <summary> /// This is using linq to sql to make the table objects. 
/// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } So I like this because I get to use objects even though it is kinda redundant. I don't get how the SP works. Like I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood but I do not even know how to take this example SP and change it to fit my tables since like I said I don't know what is going on. I also don't know what would happen if the object you have more tables in it. Like say I have a ProductName table what has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good for 500,000 records it took 52 seconds The last way of course was just using linq to do it all and it was pretty bad. /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records and that took over a minute to do. So I really narrowed it done to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationship for either way. I am not sure how they both stand up when doing updates instead of inserts as I have not gotten around to try it yet. I don't think I will ever need to insert/update more than 50,000 records at one type but at the same time I know I will have to do validation on records before inserting so that will slow it down and that sort of makes linq to sql nicer as your got objects especially if your first parsing data from a xml file before you insert into the database. Full C# code using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. 
/// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. 
adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }
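    By the way, since I said I prefer working with objects: one direction I am considering is to keep the LINQ to SQL entities and just copy them into a DataTable right before the bulk copy, so SqlBulkCopy still does the heavy lifting. This is only a rough sketch against my TBL_TEST_TEST table, meant to sit alongside the other methods in the Program class above (same usings, same connectionString field); the helper names are made up.

    // Sketch: copy LINQ to SQL entities into a DataTable, then bulk copy that.
    // TBL_TEST_TEST comes from the LINQ to SQL designer; BuildDataTable / BulkCopyEntities are invented names.
    private static DataTable BuildDataTable(IEnumerable<TBL_TEST_TEST> records)
    {
        DataTable table = new DataTable();
        table.Columns.Add("NAME", typeof(string));

        foreach (TBL_TEST_TEST record in records)
        {
            table.Rows.Add(record.NAME);
        }

        return table;
    }

    private static void BulkCopyEntities(IEnumerable<TBL_TEST_TEST> records)
    {
        DataTable table = BuildDataTable(records);

        using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity))
        {
            sbc.DestinationTableName = "TBL_TEST_TEST";
            sbc.ColumnMappings.Add("NAME", "NAME");
            sbc.WriteToServer(table);
        }
    }

    That way the validation and parsing could stay object-based, and only the final write goes through the DataTable.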

    Read the article

  • Xml failing to deserialise

    - by Carnotaurus
    I call a method to get my pages [see GetPages(String xmlFullFilePath)]. The FromXElement method is supposed to deserialise the LitePropertyData elements to strongly type LitePropertyData objects. Instead it fails on the following line: return (T)xmlSerializer.Deserialize(memoryStream); and gives the following error: <LitePropertyData xmlns=''> was not expected. What am I doing wrong? I have included the methods that I call and the xml data: public static T FromXElement<T>(this XElement xElement) { using (var memoryStream = new MemoryStream(Encoding.ASCII.GetBytes(xElement.ToString()))) { var xmlSerializer = new XmlSerializer(typeof(T)); return (T)xmlSerializer.Deserialize(memoryStream); } } public static List<LitePageData> GetPages(String xmlFullFilePath) { XDocument document = XDocument.Load(xmlFullFilePath); List<LitePageData> results = (from record in document.Descendants("row") select new LitePageData { Guid = IsValid(record, "Guid") ? record.Element("Guid").Value : null, ParentID = IsValid(record, "ParentID") ? Convert.ToInt32(record.Element("ParentID").Value) : (Int32?)null, Created = Convert.ToDateTime(record.Element("Created").Value), Changed = Convert.ToDateTime(record.Element("Changed").Value), Name = record.Element("Name").Value, ID = Convert.ToInt32(record.Element("ID").Value), LitePageTypeID = IsValid(record, "ParentID") ? Convert.ToInt32(record.Element("ParentID").Value) : (Int32?)null, Html = record.Element("Html").Value, FriendlyName = record.Element("FriendlyName").Value, Properties = record.Element("Properties") != null ? record.Element("Properties").Element("LitePropertyData").FromXElement<List<LitePropertyData>>() : new List<LitePropertyData>() }).ToList(); return results; } Here is the xml: <?xml version="1.0" encoding="utf-8"?> <root> <rows> <row> <ID>1</ID> <ImageUrl></ImageUrl> <Html>Home page</Html> <Created>01-01-2012</Created> <Changed>01-01-2012</Changed> <Name>Home page</Name> <FriendlyName>home-page</FriendlyName> </row> <row xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <Guid>edeaf468-f490-4271-bf4d-be145bc6a1fd</Guid> <ID>8</ID> <Name>Unused</Name> <ParentID>1</ParentID> <Created>2006-03-25T10:57:17</Created> <Changed>2012-07-17T12:24:30.0984747+01:00</Changed> <ChangedBy /> <LitePageTypeID xsi:nil="true" /> <Html> What is the purpose of this option? This option checks the current document for accessibility issues. It uses Bobby to provide details of whether the current web page conforms to W3C's WCAG criteria for web content accessibility. Issues with Bobby and Cynthia Bobby and Cynthia are free services that supposedly allow a user to expose web page accessibility barriers. It is something of a guide but perhaps a blunt instrument. I tested a few of the webpages that I have designed. Sure enough, my pages fall short and for good reason. I am not about to claim that Bobby and Cynthia are useless. Although it is useful and commendable tool, it project appears to be overly ambitious. Nevertheless, let me explain my issues with Bobby and Cynthia: First, certain W3C standards for designing web documents are often too strict and unworkable. For instance, in some versions W3C standards for HTML, certain tags should not include a particular attribute, whereas in others they are requisite if the document is to be ???well-formed???. The standard that a designer chooses is determined usually by the requirements specification document. 
This specifies which browsers and versions of those browsers that the web page is expected to correctly display. Forcing a hypertext document to conform strictly to a specific W3C standard for HTML is often no simple task. In the worst case, it cannot conform without losing some aesthetics or accessibility functionality. Second, the case of HTML documents is not an isolated case. Standards for XML, XSL, JavaScript, VBScript, are analogous. Therefore, you might imagine the problems when you begin to combine these languages and formats in an HTML document. Third, there is always more than one way to skin a cat. For example, Bobby and Cynthia may flag those IMG tags that do not contain a TITLE attribute. There might be good reason that a web developer chooses not to include the title attribute. The title attribute has a limited numbers of characters and does not support carriage returns. This is a major defect in the design of this tag. In fact, before the TITLE attribute was supported, there was the ALT attribute. Most browsers support both, yet they both perform a similar function. However, both attributes share the same deficiencies. In practice, there are instances where neither attribute would be used. Instead, for example, the developer would write some JavaScript or VBScript to circumvent these deficiencies. The concern is that Bobby and Cynthia would not notice this because it does not ???understand??? what the JavaScript does. </Html> <FriendlyName>unused</FriendlyName> <IsDeleted>false</IsDeleted> <Properties> <LitePropertyData> <Description>Image for the page</Description> <DisplayEditUI>true</DisplayEditUI> <OwnerTab>1</OwnerTab> <DisplayName>Image Url</DisplayName> <FieldOrder>1</FieldOrder> <IsRequired>false</IsRequired> <Name>ImageUrl</Name> <IsModified>false</IsModified> <ParentPageID>3</ParentPageID> <Type>String</Type> <Value xsi:type="xsd:string">smarter.jpg</Value> </LitePropertyData> <LitePropertyData> <Description>WebItemApplicationEnum</Description> <DisplayEditUI>true</DisplayEditUI> <OwnerTab>1</OwnerTab> <DisplayName>WebItemApplicationEnum</DisplayName> <FieldOrder>1</FieldOrder> <IsRequired>false</IsRequired> <Name>WebItemApplicationEnum</Name> <IsModified>false</IsModified> <ParentPageID>3</ParentPageID> <Type>Number</Type> <Value xsi:type="xsd:string">1</Value> </LitePropertyData> </Properties> <Seo> <Author>Phil Carney</Author> <Classification /> <Copyright>Carnotaurus</Copyright> <Description> What is the purpose of this option? This option checks the current document for accessibility issues. It uses Bobby to provide details of whether the current web page conforms to W3C's WCAG criteria for web content accessibility. Issues with Bobby and Cynthia Bobby and Cynthia are free services that supposedly allow a user to expose web page accessibility barriers. It is something of a guide but perhaps a blunt instrument. I tested a few of the webpages that I have designed. Sure enough, my pages fall short and for good reason. I am not about to claim that Bobby and Cynthia are useless. Although it is useful and commendable tool, it project appears to be overly ambitious. Nevertheless, let me explain my issues with Bobby and Cynthia: First, certain W3C standards for designing web documents are often too strict and unworkable. For instance, in some versions W3C standards for HTML, certain tags should not include a particular attribute, whereas in others they are requisite if the document is to be ???well-formed???. 
The standard that a designer chooses is determined usually by the requirements specification document. This specifies which browsers and versions of those browsers that the web page is expected to correctly display. Forcing a hypertext document to conform strictly to a specific W3C standard for HTML is often no simple task. In the worst case, it cannot conform without losing some aesthetics or accessibility functionality. Second, the case of HTML documents is not an isolated case. Standards for XML, XSL, JavaScript, VBScript, are analogous. Therefore, you might imagine the problems when you begin to combine these languages and formats in an HTML document. Third, there is always more than one way to skin a cat. For example, Bobby and Cynthia may flag those IMG tags that do not contain a TITLE attribute. There might be good reason that a web developer chooses not to include the title attribute. The title attribute has a limited numbers of characters and does not support carriage returns. This is a major defect in the design of this tag. In fact, before the TITLE attribute was supported, there was the ALT attribute. Most browsers support both, yet they both perform a similar function. However, both attributes share the same deficiencies. In practice, there are instances where neither attribute would be used. Instead, for example, the developer would write some JavaScript or VBScript to circumvent these deficiencies. The concern is that Bobby and Cynthia would not notice this because it does not ???understand??? what the JavaScript does. </Description> <Keywords>unused</Keywords> <Title>unused</Title> </Seo> </row> </rows> </root> EDIT Here are my entities: public class LitePropertyData { public virtual string Description { get; set; } public virtual bool DisplayEditUI { get; set; } public int OwnerTab { get; set; } public virtual string DisplayName { get; set; } public int FieldOrder { get; set; } public bool IsRequired { get; set; } public string Name { get; set; } public virtual bool IsModified { get; set; } public virtual int ParentPageID { get; set; } public LiteDataType Type { get; set; } public object Value { get; set; } } [Serializable] public class LitePageData { public String Guid { get; set; } public Int32 ID { get; set; } public String Name { get; set; } public Int32? ParentID { get; set; } public DateTime Created { get; set; } public String CreatedBy { get; set; } public DateTime Changed { get; set; } public String ChangedBy { get; set; } public Int32? LitePageTypeID { get; set; } public String Html { get; set; } public String FriendlyName { get; set; } public Boolean IsDeleted { get; set; } public List<LitePropertyData> Properties { get; set; } public LiteSeoPageData Seo { get; set; } /// <summary> /// Saves the specified XML full file path. /// </summary> /// <param name="xmlFullFilePath">The XML full file path.</param> public void Save(String xmlFullFilePath) { XDocument doc = XDocument.Load(xmlFullFilePath); XElement demoNode = this.ToXElement<LitePageData>(); demoNode.Name = "row"; doc.Descendants("rows").Single().Add(demoNode); doc.Save(xmlFullFilePath); } }
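    One thought I had after posting: maybe the serializer simply doesn't like being handed a single LitePropertyData element when the target type is List&lt;LitePropertyData&gt;. A sketch of the direction I might try is an overload (in the same static class as the existing FromXElement) that lets me name the expected root element and then pass the whole Properties element; this is not tested, and the xsi/xsd prefixes on the Value element may still need attention.

    // Sketch: deserialize the whole <Properties> element into a List<LitePropertyData>
    // by telling XmlSerializer which root element name to expect.
    public static T FromXElement<T>(this XElement xElement, string rootElementName)
    {
        using (var memoryStream = new MemoryStream(Encoding.UTF8.GetBytes(xElement.ToString())))
        {
            // note: this XmlSerializer constructor overload is not cached by the framework,
            // so cache the serializer yourself if this runs often
            var xmlSerializer = new XmlSerializer(typeof(T), new XmlRootAttribute(rootElementName));
            return (T)xmlSerializer.Deserialize(memoryStream);
        }
    }

    // and in GetPages, hand over the <Properties> element itself:
    // Properties = record.Element("Properties") != null
    //     ? record.Element("Properties").FromXElement<List<LitePropertyData>>("Properties")
    //     : new List<LitePropertyData>()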

    Read the article

  • How to edit item in a listbox shown from reading a .csv file?

    - by Shuvo
    I am working in a project where my application can open a .csv file and read data from it. The .csv file contains the latitude, longitude of places. The application reads data from the file shows it in a static map and display icon on the right places. The application can open multiple file at a time and it opens with a new tab every time. But I am having trouble in couple of cases When I am trying to add a new point to the .csv file opened. I am able to write new point on the same file instead adding a new point data to the existing its replacing others and writing the new point only. I cannot use selectedIndexChange event to perform edit option on the listbox and then save the file. Any direction would be great. using System; using System.Collections.Generic; using System.ComponentModel; using System.Data; using System.Drawing; using System.Linq; using System.Text; using System.Windows.Forms; using System.IO; namespace CourseworkExample { public partial class Form1 : Form { public GPSDataPoint gpsdp; List<GPSDataPoint> data; List<PictureBox> pictures; List<TabPage> tabs; public static int pn = 0; private TabPage currentComponent; private Bitmap bmp1; string[] symbols = { "hospital", "university" }; Image[] symbolImages; ListBox lb = new ListBox(); string name = ""; string path = ""; public Form1() { InitializeComponent(); data = new List<GPSDataPoint>(); pictures = new List<PictureBox>(); tabs = new List<TabPage>(); symbolImages = new Image[symbols.Length]; for (int i = 0; i < 2; i++) { string location = "data/" + symbols[i] + ".png"; symbolImages[i] = Image.FromFile(location); } } private void openToolStripMenuItem_Click(object sender, EventArgs e) { FileDialog ofd = new OpenFileDialog(); string filter = "CSV File (*.csv)|*.csv"; ofd.Filter = filter; DialogResult dr = ofd.ShowDialog(); if (dr.Equals(DialogResult.OK)) { int i = ofd.FileName.LastIndexOf("\\"); name = ofd.FileName; path = ofd.FileName; if (i > 0) { name = ofd.FileName.Substring(i + 1); path = ofd.FileName.Substring(0, i + 1); } TextReader input = new StreamReader(ofd.FileName); string mapName = input.ReadLine(); GPSDataPoint gpsD = new GPSDataPoint(); gpsD.setBounds(input.ReadLine()); string s; while ((s = input.ReadLine()) != null) { gpsD.addWaypoint(s); } input.Close(); TabPage tabPage = new TabPage(); tabPage.Location = new System.Drawing.Point(4, 22); tabPage.Name = "tabPage" + pn; lb.Width = 300; int selectedindex = lb.SelectedIndex; lb.Items.Add(mapName); lb.Items.Add("Bounds"); lb.Items.Add(gpsD.Bounds[0] + " " + gpsD.Bounds[1] + " " + gpsD.Bounds[2] + " " + gpsD.Bounds[3]); lb.Items.Add("Waypoint"); foreach (WayPoint wp in gpsD.DataList) { lb.Items.Add(wp.Name + " " + wp.Latitude + " " + wp.Longitude + " " + wp.Ele + " " + wp.Sym); } tabPage.Controls.Add(lb); pn++; tabPage.Padding = new System.Windows.Forms.Padding(3); tabPage.Size = new System.Drawing.Size(192, 74); tabPage.TabIndex = 0; tabPage.Text = name; tabPage.UseVisualStyleBackColor = true; tabs.Add(tabPage); tabControl1.Controls.Add(tabPage); tabPage = new TabPage(); tabPage.Location = new System.Drawing.Point(4, 22); tabPage.Name = "tabPage" + pn; pn++; tabPage.Padding = new System.Windows.Forms.Padding(3); tabPage.Size = new System.Drawing.Size(192, 74); tabPage.TabIndex = 0; tabPage.Text = mapName; string location = path + mapName; tabPage.UseVisualStyleBackColor = true; tabs.Add(tabPage); PictureBox pb = new PictureBox(); pb.Name = "pictureBox" + pn; pb.Image = Image.FromFile(location); tabControl2.Controls.Add(tabPage); pb.Width = pb.Image.Width; 
pb.Height = pb.Image.Height; tabPage.Controls.Add(pb); currentComponent = tabPage; tabPage.Width = pb.Width; tabPage.Height = pb.Height; pn++; tabControl2.Width = pb.Width; tabControl2.Height = pb.Height; bmp1 = (Bitmap)pb.Image; int lx, ly; float realWidth = gpsD.Bounds[1] - gpsD.Bounds[3]; float imageW = pb.Image.Width; float dx = imageW * (gpsD.Bounds[1] - gpsD.getWayPoint(0).Longitude) / realWidth; float realHeight = gpsD.Bounds[0] - gpsD.Bounds[2]; float imageH = pb.Image.Height; float dy = imageH * (gpsD.Bounds[0] - gpsD.getWayPoint(0).Latitude) / realHeight; lx = (int)dx; ly = (int)dy; using (Graphics g = Graphics.FromImage(bmp1)) { Rectangle rect = new Rectangle(lx, ly, 20, 20); if (gpsD.getWayPoint(0).Sym.Equals("")) { g.DrawRectangle(new Pen(Color.Red), rect); } else { if (gpsD.getWayPoint(0).Sym.Equals("hospital")) { g.DrawImage(symbolImages[0], rect); } else { if (gpsD.getWayPoint(0).Sym.Equals("university")) { g.DrawImage(symbolImages[1], rect); } } } } pb.Image = bmp1; pb.Invalidate(); } } private void openToolStripMenuItem_Click_1(object sender, EventArgs e) { FileDialog ofd = new OpenFileDialog(); string filter = "CSV File (*.csv)|*.csv"; ofd.Filter = filter; DialogResult dr = ofd.ShowDialog(); if (dr.Equals(DialogResult.OK)) { int i = ofd.FileName.LastIndexOf("\\"); name = ofd.FileName; path = ofd.FileName; if (i > 0) { name = ofd.FileName.Substring(i + 1); path = ofd.FileName.Substring(0, i + 1); } TextReader input = new StreamReader(ofd.FileName); string mapName = input.ReadLine(); GPSDataPoint gpsD = new GPSDataPoint(); gpsD.setBounds(input.ReadLine()); string s; while ((s = input.ReadLine()) != null) { gpsD.addWaypoint(s); } input.Close(); TabPage tabPage = new TabPage(); tabPage.Location = new System.Drawing.Point(4, 22); tabPage.Name = "tabPage" + pn; ListBox lb = new ListBox(); lb.Width = 300; lb.Items.Add(mapName); lb.Items.Add("Bounds"); lb.Items.Add(gpsD.Bounds[0] + " " + gpsD.Bounds[1] + " " + gpsD.Bounds[2] + " " + gpsD.Bounds[3]); lb.Items.Add("Waypoint"); foreach (WayPoint wp in gpsD.DataList) { lb.Items.Add(wp.Name + " " + wp.Latitude + " " + wp.Longitude + " " + wp.Ele + " " + wp.Sym); } tabPage.Controls.Add(lb); pn++; tabPage.Padding = new System.Windows.Forms.Padding(3); tabPage.Size = new System.Drawing.Size(192, 74); tabPage.TabIndex = 0; tabPage.Text = name; tabPage.UseVisualStyleBackColor = true; tabs.Add(tabPage); tabControl1.Controls.Add(tabPage); tabPage = new TabPage(); tabPage.Location = new System.Drawing.Point(4, 22); tabPage.Name = "tabPage" + pn; pn++; tabPage.Padding = new System.Windows.Forms.Padding(3); tabPage.Size = new System.Drawing.Size(192, 74); tabPage.TabIndex = 0; tabPage.Text = mapName; string location = path + mapName; tabPage.UseVisualStyleBackColor = true; tabs.Add(tabPage); PictureBox pb = new PictureBox(); pb.Name = "pictureBox" + pn; pb.Image = Image.FromFile(location); tabControl2.Controls.Add(tabPage); pb.Width = pb.Image.Width; pb.Height = pb.Image.Height; tabPage.Controls.Add(pb); currentComponent = tabPage; tabPage.Width = pb.Width; tabPage.Height = pb.Height; pn++; tabControl2.Width = pb.Width; tabControl2.Height = pb.Height; bmp1 = (Bitmap)pb.Image; int lx, ly; float realWidth = gpsD.Bounds[1] - gpsD.Bounds[3]; float imageW = pb.Image.Width; float dx = imageW * (gpsD.Bounds[1] - gpsD.getWayPoint(0).Longitude) / realWidth; float realHeight = gpsD.Bounds[0] - gpsD.Bounds[2]; float imageH = pb.Image.Height; float dy = imageH * (gpsD.Bounds[0] - gpsD.getWayPoint(0).Latitude) / realHeight; lx = (int)dx; ly = (int)dy; 
using (Graphics g = Graphics.FromImage(bmp1)) { Rectangle rect = new Rectangle(lx, ly, 20, 20); if (gpsD.getWayPoint(0).Sym.Equals("")) { g.DrawRectangle(new Pen(Color.Red), rect); } else { if (gpsD.getWayPoint(0).Sym.Equals("hospital")) { g.DrawImage(symbolImages[0], rect); } else { if (gpsD.getWayPoint(0).Sym.Equals("university")) { g.DrawImage(symbolImages[1], rect); } } } } pb.Image = bmp1; pb.Invalidate(); MessageBox.Show(data.ToString()); } } private void exitToolStripMenuItem_Click(object sender, EventArgs e) { this.Close(); } private void addBtn_Click(object sender, EventArgs e) { string wayName = nameTxtBox.Text; float wayLat = Convert.ToSingle(latTxtBox.Text); float wayLong = Convert.ToSingle(longTxtBox.Text); float wayEle = Convert.ToSingle(elevTxtBox.Text); WayPoint wp = new WayPoint(wayName, wayLat, wayLong, wayEle); GPSDataPoint gdp = new GPSDataPoint(); data = new List<GPSDataPoint>(); gdp.Add(wp); lb.Items.Add(wp.Name + " " + wp.Latitude + " " + wp.Longitude + " " + wp.Ele + " " + wp.Sym); lb.Refresh(); StreamWriter sr = new StreamWriter(name); sr.Write(lb); sr.Close(); DialogResult result = MessageBox.Show("Save in New File?","Save", MessageBoxButtons.YesNo); if (result == DialogResult.Yes) { SaveFileDialog saveDialog = new SaveFileDialog(); saveDialog.FileName = "default.csv"; DialogResult saveResult = saveDialog.ShowDialog(); if (saveResult == DialogResult.OK) { sr = new StreamWriter(saveDialog.FileName, true); sr.WriteLine(wayName + "," + wayLat + "," + wayLong + "," + wayEle); sr.Close(); } } else { // sr = new StreamWriter(name, true); // sr.WriteLine(wayName + "," + wayLat + "," + wayLong + "," + wayEle); sr.Close(); } MessageBox.Show(name + path); } } } GPSDataPoint.cs using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.IO; namespace CourseworkExample { public class GPSDataPoint { private float[] bounds; private List<WayPoint> dataList; public GPSDataPoint() { dataList = new List<WayPoint>(); } internal void setBounds(string p) { string[] b = p.Split(','); bounds = new float[b.Length]; for (int i = 0; i < b.Length; i++) { bounds[i] = Convert.ToSingle(b[i]); } } public float[] Bounds { get { return bounds; } } internal void addWaypoint(string s) { WayPoint wp = new WayPoint(s); dataList.Add(wp); } public WayPoint getWayPoint(int i) { if (i < dataList.Count) { return dataList[i]; } else return null; } public List<WayPoint> DataList { get { return dataList; } } internal void Add(WayPoint wp) { dataList.Add(wp); } } } WayPoint.cs using System; using System.Collections.Generic; using System.Linq; using System.Text; namespace CourseworkExample { public class WayPoint { private string name; private float ele; private float latitude; private float longitude; private string sym; public WayPoint(string name, float latitude, float longitude, float elevation) { this.name = name; this.latitude = latitude; this.longitude = longitude; this.ele = elevation; } public WayPoint() { name = "no name"; ele = 3.5F; latitude = 3.5F; longitude = 0.0F; sym = "no symbol"; } public WayPoint(string s) { string[] bits = s.Split(','); name = bits[0]; longitude = Convert.ToSingle(bits[2]); latitude = Convert.ToSingle(bits[1]); if (bits.Length > 4) sym = bits[4]; else sym = ""; try { ele = Convert.ToSingle(bits[3]); } catch (Exception e) { ele = 0.0f; } } public float Longitude { get { return longitude; } set { longitude = value; } } public float Latitude { get { return latitude; } set { latitude = value; } } public float Ele { get { return ele; } set 
{ ele = value; } } public string Name { get { return name; } set { name = value; } } public string Sym { get { return sym; } set { sym = value; } } } } .csv file data birthplace.png 51.483788,-0.351906,51.460745,-0.302982 Born Here,51.473805,-0.32532,-,hospital Danced here,51,483805,-0.32532,-,hospital
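    For the first problem, one thing I plan to try is opening the file in append mode instead of creating a new StreamWriter over it: new StreamWriter(name) truncates the file, and sr.Write(lb) only writes the ListBox's ToString() rather than its items. A rough sketch, with a made-up helper name:

    // Sketch: append the new waypoint to the .csv that was opened, instead of overwriting it.
    private void AppendWaypoint(string csvFullPath, WayPoint wp)
    {
        // the second argument 'true' opens the file in append mode
        using (StreamWriter sw = new StreamWriter(csvFullPath, true))
        {
            sw.WriteLine(wp.Name + "," + wp.Latitude + "," + wp.Longitude + "," + wp.Ele + "," + wp.Sym);
        }
    }

    // called from addBtn_Click with the path of the file shown in the current tab, e.g.
    // AppendWaypoint(path + name, wp);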

    Read the article

  • Confused Why I am getting C1010 error?

    - by bluepixel
    I have three files: Main, slist.h and slist.cpp can be seen at http://forums.devarticles.com/c-c-help-52/confused-why-i-am-getting-c2143-and-c1010-error-259574.html I'm trying to make a program where main reads the list of student names from a file (roster.txt) and inserts all the names in a list in ascending order. This is the full class roster list (notCheckedIN). From here I will read all students who have come to write the exams, each checkin will transfer their name to another list (in ascending order) called present. The final product is notCheckedIN will contain a list of all those students that did not write the exam and present will contain the list of all students who wrote the exam Main File: // Exam.cpp : Defines the entry point for the console application. #include "stdafx.h" #include "iostream" #include "iomanip" #include "fstream" #include "string" #include "slist.h" using namespace std; void OpenFile(ifstream&); void GetClassRoster(SortList&, ifstream&); void InputStuName(SortList&, SortList&); void UpdateList(SortList&, SortList&, string); void Print(SortList&, SortList&); const string END_DATA = "EndData"; int main() { ifstream roster; SortList notCheckedIn; //students present SortList present; //student absent OpenFile(roster); if(!roster) //Make sure file is opened return 1; GetClassRoster(notCheckedIn, roster); //insert the roster list into the notCheckedIn list InputStuName(present, notCheckedIn); Print(present, notCheckedIn); return 0; } void OpenFile(ifstream& roster) //Precondition: roster is pointing to file containing student anmes //Postcondition:IF file does not exist -> exit { string fileName = "roster.txt"; roster.open(fileName.c_str()); if(!roster) cout << "***ERROR CANNOT OPEN FILE :"<< fileName << "***" << endl; } void GetClassRoster(SortList& notCheckedIN, ifstream& roster) //Precondition:roster points to file containing list of student last name // && notCheckedIN is empty //Postcondition:notCheckedIN is filled with the names taken from roster.txt in ascending order { string name; roster >> name; while(roster) { notCheckedIN.Insert(name); roster >> name; } } void InputStuName(SortList& present, SortList& notCheckedIN) //Precondition: present list is empty initially and notCheckedIN list is full //Postcondition: repeated prompting to enter stuName // && notCheckedIN will delete all names found in present // && present will contain names present // && names not found in notCheckedIN will report Error { string stuName; cout << "Enter last name (Enter EndData if none to Enter): "; cin >> stuName; while(stuName!=END_DATA) { UpdateList(present, notCheckedIN, stuName); } } void UpdateList(SortList& present, SortList& notCheckedIN, string stuName) //Precondition:stuName is assigned //Postcondition:IF stuName is present, stuName is inserted in present list // && stuName is removed from the notCheckedIN list // ELSE stuName does not exist { if(notCheckedIN.isPresent(stuName)) { present.Insert(stuName); notCheckedIN.Delete(stuName); } else cout << "NAME IS NOT PRESENT" << endl; } void Print(SortList& present, SortList& notCheckedIN) //Precondition: present and notCheckedIN contains a list of student Names present/not present //Postcondition: content of present and notCheckedIN is printed { cout << "Candidates Present" << endl; present.Print(); cout << "Candidates Absent" << endl; notCheckedIN.Print(); } Header File: //Specification File: slist.h //This file gives the specifications of a list abstract data type //List items inserted will be in order //Class 
SortList, structured type used to represent an ADT using namespace std; const int MAX_LENGTH = 200; typedef string ItemType; //Class Object (class instance) SortList. Variable of class type. class SortList { //Class Member - components of a class, can be either data or functions public: //Constructor //Post-condition: Empty list is created SortList(); //Const member function. Compiler error occurs if any statement within tries to modify a private data bool isEmpty() const; //Post-condition: == true if list is empty // == false if list is not empty bool isFull() const; //Post-condition: == true if list is full // == false if list is full int Length() const; //Post-condition: size of list void Insert(ItemType item); //Precondition: NOT isFull() && item is assigned //Postcondition: item is in list && Length() = Length()@entry + 1 void Delete(ItemType item); //Precondition: NOT isEmpty() && item is assigned //Postcondition: // IF items is in list at entry // first occurance of item in list is removed // && Length() = Length()@entry -1; // ELSE // list is not changed bool isPresent(ItemType item) const; //Precondition: item is assigned //Postcondition: == true if item is present in list // == false if item is not present in list void Print() const; //Postcondition: All component of list have been output private: int length; ItemType data[MAX_LENGTH]; void BinSearch(ItemType, bool&, int&) const; }; Source File: //Implementation File: slist.cpp //This file gives the specifications of a list abstract data type //List items inserted will be in order //Class SortList, structured type used to represent an ADT #include "iostream" #include "slist.h" using namespace std; // int length; // ItemType data[MAX_SIZE]; //Class Object (class instance) SortList. Variable of class type. SortList::SortList() //Constructor //Post-condition: Empty list is created { length=0; } //Const member function. 
Compiler error occurs if any statement within tries to modify a private data bool SortList::isEmpty() const //Post-condition: == true if list is empty // == false if list is not empty { return(length==0); } bool SortList::isFull() const //Post-condition: == true if list is full // == false if list is full { return (length==(MAX_LENGTH-1)); } int SortList::Length() const //Post-condition: size of list { return length; } void SortList::Insert(ItemType item) //Precondition: NOT isFull() && item is assigned //Postcondition: item is in list && Length() = Length()@entry + 1 // && list componenet are in ascending order of value { int index; index = length -1; while(index >=0 && item<data[index]) { data[index+1]=data[index]; index--; } data[index+1]=item; length++; } void SortList:elete(ItemType item) //Precondition: NOT isEmpty() && item is assigned //Postcondition: // IF items is in list at entry // first occurance of item in list is removed // && Length() = Length()@entry -1; // && list components are in ascending order // ELSE data array is unchanged { bool found; int position; BinSearch(item,found,position); if (found) { for(int index = position; index < length; index++) data[index]=data[index+1]; length--; } } bool SortList::isPresent(ItemType item) const //Precondition: item is assigned && length <= MAX_LENGTH && items are in ascending order //Postcondition: true if item is found in the list // false if item is not found in the list { bool found; int position; BinSearch(item,found,position); return (found); } void SortList::Print() const //Postcondition: All component of list have been output { for(int x= 0; x<length; x++) cout << data[x] << endl; } void SortList::BinSearch(ItemType item, bool found, int position) const //Precondition: item contains item to be found // && item in the list is an ascending order //Postcondition: IF item is in list, position is returned // ELSE item does not exist in the list { int first = 0; int last = length -1; int middle; found = false; while(!found) { middle = (first+last)/2; if(data[middle]<item) first = middle+1; else if (data[middle] > item) last = middle -1; else found = true; } if(found) position = middle; } I cannot get rid of the C1010 error: fatal error C1010: unexpected end of file while looking for precompiled header. Did you forget to add '#include "stdafx.h"' to your source? Is there a way to get rid of this error? When I included "stdafx.h" I received the following 32 errors (which does not make sense to me why because I referred back to my manual on how to use Class method - everything looks a.ok.) Error 1 error C2871: 'std' : a namespace with this name does not exist c:\..\slist.h 6 Error 2 error C2146: syntax error : missing ';' before identifier 'ItemType' c:\..\slist.h 8 Error 3 error C4430: missing type specifier - int assumed. Note: C++ does not support default-int c:\..\slist.h 8 Error 4 error C4430: missing type specifier - int assumed. Note: C++ does not support default-int c:\..\slist.h 8 Error 5 error C2061: syntax error : identifier 'ItemType' c:\..\slist.h 30 Error 6 error C2061: syntax error : identifier 'ItemType' c:\..\slist.h 34 Error 7 error C2061: syntax error : identifier 'ItemType' c:\..\slist.h 43 Error 8 error C2146: syntax error : missing ';' before identifier 'data' c:\..\slist.h 52 Error 9 error C4430: missing type specifier - int assumed. Note: C++ does not support default-int c:\..\slist.h 52 Error 10 error C4430: missing type specifier - int assumed. 
Note: C++ does not support default-int c:\..\slist.h 52 Error 11 error C2061: syntax error : identifier 'ItemType' c:\..\slist.h 53 Error 12 error C2146: syntax error : missing ')' before identifier 'item' c:\..\slist.cpp 41 Error 13 error C2761: 'void SortList::Insert(void)' : member function redeclaration not allowed c:\..\slist.cpp 41 Error 14 error C2059: syntax error : ')' c:\..\slist.cpp 41 Error 15 error C2143: syntax error : missing ';' before '{' c:\..\slist.cpp 45 Error 16 error C2447: '{' : missing function header (old-style formal list?) c:\..\slist.cpp 45 Error 17 error C2146: syntax error : missing ')' before identifier 'item' c:\..\slist.cpp 57 Error 18 error C2761: 'void SortList:elete(void)' : member function redeclaration not allowed c:\..\slist.cpp 57 Error 19 error C2059: syntax error : ')' c:\..\slist.cpp 57 Error 20 error C2143: syntax error : missing ';' before '{' c:\..\slist.cpp 65 Error 21 error C2447: '{' : missing function header (old-style formal list?) c:\..\slist.cpp 65 Error 22 error C2146: syntax error : missing ')' before identifier 'item' c:\..\slist.cpp 79 Error 23 error C2761: 'bool SortList::isPresent(void) const' : member function redeclaration not allowed c:\..\slist.cpp 79 Error 24 error C2059: syntax error : ')' c:\..\slist.cpp 79 Error 25 error C2143: syntax error : missing ';' before '{' c:\..\slist.cpp 83 Error 26 error C2447: '{' : missing function header (old-style formal list?) c:\..\slist.cpp 83 Error 27 error C2065: 'data' : undeclared identifier c:\..\slist.cpp 95 Error 28 error C2146: syntax error : missing ')' before identifier 'item' c:\..\slist.cpp 98 Error 29 error C2761: 'void SortList::BinSearch(void) const' : member function redeclaration not allowed c:\..\slist.cpp 98 Error 30 error C2059: syntax error : ')' c:\..\slist.cpp 98 Error 31 error C2143: syntax error : missing ';' before '{' c:\..\slist.cpp 103 Error 32 error C2447: '{' : missing function header (old-style formal list?) c:\..\slist.cpp 103

    Read the article

  • SVN Error 403 Forbidden

    - by Chris
    I can't figure this out. I try to import a new project into a svn repository from Netbeans and get 403 Forbidden. I just setup svn on my serverbox today. I can get to it through a browser just fine, though its empty as I haven't imported my project yet. Apache's path for html files is /var/www I setup the svn repo in /var/svn This is the structure of /var/svn [root@localhost svn]# ls -lR /var/svn /var/svn: total 4 drwxrwxrwx 7 apache apache 4096 2010-03-26 10:18 repo /var/svn/repo: total 36 drwxrwxrwx 2 apache apache 4096 2010-03-26 09:47 conf drwxrwxrwx 3 apache apache 4096 2010-03-26 10:18 dav drwxrwsrwx 6 apache apache 4096 2010-03-26 11:19 db -rwxrwxrwx 1 apache apache 2 2010-03-26 09:47 format drwxrwxrwx 2 apache apache 4096 2010-03-26 09:47 hooks drwxrwxrwx 2 apache apache 4096 2010-03-26 09:47 locks -rwxrwxrwx 1 apache apache 229 2010-03-26 09:47 README.txt -rwxrwxrwx 1 apache apache 15 2010-03-26 09:47 svnauth -rwxrwxrwx 1 apache apache 43 2010-03-26 09:48 svnpass /var/svn/repo/conf: total 12 -rwxrwxrwx 1 apache apache 1080 2010-03-26 09:47 authz -rwxrwxrwx 1 apache apache 309 2010-03-26 09:47 passwd -rwxrwxrwx 1 apache apache 2279 2010-03-26 09:47 svnserve.conf /var/svn/repo/dav: total 4 drwxrwxrwx 2 apache apache 4096 2010-03-26 11:19 activities.d /var/svn/repo/dav/activities.d: total 0 /var/svn/repo/db: total 48 -rwxrwxrwx 1 apache apache 2 2010-03-26 09:47 current -rwxrwxrwx 1 apache apache 22 2010-03-26 09:47 format -rwxrwxrwx 1 apache apache 1920 2010-03-26 09:47 fsfs.conf -rwxrwxrwx 1 apache apache 5 2010-03-26 09:47 fs-type -rwxrwxrwx 1 apache apache 2 2010-03-26 09:47 min-unpacked-rev -rwxrwxrwx 1 apache apache 4096 2010-03-26 09:47 rep-cache.db drwxrwsrwx 3 apache apache 4096 2010-03-26 09:47 revprops drwxrwsrwx 3 apache apache 4096 2010-03-26 09:47 revs drwxrwsrwx 2 apache apache 4096 2010-03-26 11:19 transactions -rwxrwxrwx 1 apache apache 2 2010-03-26 11:19 txn-current -rwxrwxrwx 1 apache apache 0 2010-03-26 09:47 txn-current-lock drwxrwsrwx 2 apache apache 4096 2010-03-26 11:19 txn-protorevs -rwxrwxrwx 1 apache apache 37 2010-03-26 09:47 uuid -rwxrwxrwx 1 apache apache 0 2010-03-26 09:47 write-lock /var/svn/repo/db/revprops: total 4 drwxrwsrwx 2 apache apache 4096 2010-03-26 09:47 0 /var/svn/repo/db/revprops/0: total 4 -rwxrwxrwx 1 apache apache 50 2010-03-26 09:47 0 /var/svn/repo/db/revs: total 4 drwxrwsrwx 2 apache apache 4096 2010-03-26 09:47 0 /var/svn/repo/db/revs/0: total 4 -rwxrwxrwx 1 apache apache 115 2010-03-26 09:47 0 /var/svn/repo/db/transactions: total 0 /var/svn/repo/db/txn-protorevs: total 0 /var/svn/repo/hooks: total 36 -rwxrwxrwx 1 apache apache 1955 2010-03-26 09:47 post-commit.tmpl -rwxrwxrwx 1 apache apache 1638 2010-03-26 09:47 post-lock.tmpl -rwxrwxrwx 1 apache apache 2267 2010-03-26 09:47 post-revprop-change.tmpl -rwxrwxrwx 1 apache apache 1567 2010-03-26 09:47 post-unlock.tmpl -rwxrwxrwx 1 apache apache 3404 2010-03-26 09:47 pre-commit.tmpl -rwxrwxrwx 1 apache apache 2410 2010-03-26 09:47 pre-lock.tmpl -rwxrwxrwx 1 apache apache 2764 2010-03-26 09:47 pre-revprop-change.tmpl -rwxrwxrwx 1 apache apache 2100 2010-03-26 09:47 pre-unlock.tmpl -rwxrwxrwx 1 apache apache 2758 2010-03-26 09:47 start-commit.tmpl /var/svn/repo/locks: total 8 -rwxrwxrwx 1 apache apache 139 2010-03-26 09:47 db.lock -rwxrwxrwx 1 apache apache 139 2010-03-26 09:47 db-logs.lock I've got httpd.conf loading svn.conf which contains: <Location /svn> DAV on DAV svn #SVNParentPath /var/svn SVNPath /var/svn/repo Authtype Basic AuthName "Subversion" AuthUserFile 
/var/svn/repo/svnpass Require valid-user AuthzSVNAccessFile /var/svn/repo/svnauth </Location> Full error message is: org.tigris.subversion.javahl.ClientException: RA layer request failed Server sent unexpected return value (403 Forbidden) in response to CHECKOUT request for '/svn/!svn/bln/0' Sorry for the incredibly long post, but I thought more info would be better than less. I've been fidgeting with this problem for a long time now.
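    One thing I still need to double-check is the AuthzSVNAccessFile, since mod_authz_svn answers with 403 when the path rules don't grant enough access, and my svnauth file is only 15 bytes so I'm not sure it grants write. Just as an illustration of what I understand a permissive file would look like (not my actual file):

    # example /var/svn/repo/svnauth granting all authenticated users
    # read/write access to the whole repository
    [/]
    * = rw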

    Read the article

  • my asp.net mvc 2.0 application fails with error "No parameterless constructor defined for this objec

    - by loviji
    Hello, i'm new in asp.net mvc 2. I'm trying to list all data from one table(ms sql server table). as ORM I use Entity Framework. now, I'm tried to write something to do this: Model: private uqsEntities _uqsEntity; public permissionRepository(uqsEntities uqsEntity) { _uqsEntity = uqsEntity; } public IEnumerable<userPermissions> getAllData() { return _uqsEntity.userPermissions; } controller: private DataManager _dataManager; public ManagePermissionsController(DataManager datamanager) { _dataManager = datamanager; } public ActionResult Index() { return RedirectToAction("List"); } [AcceptVerbs(HttpVerbs.Get)] public ActionResult List() { return List(null); } [AcceptVerbs(HttpVerbs.Post)] public ActionResult List(int? userID) { return View(_dataManager.Permission.getAllData().ToList()); } Route: routes.MapRoute( "ManagePerm", // Route name "{controller}/{action}/{id}", // URL with parameters new { controller = "ManagePermissions", action = "Index"}, new string[] { "uqs.Controllers" } // Parameter defaults ); and View automatically generated by Visual Studio(in action mouse right-click). when I run app. , my app. fails. No parameterless constructor defined for this object. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.MissingMethodException: No parameterless constructor defined for this object. Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Stack Trace: [MissingMethodException: No parameterless constructor defined for this object.] System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean noCheck, Boolean& canBeCached, RuntimeMethodHandle& ctor, Boolean& bNeedSecurityCheck) +0 System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean fillCache) +86 System.RuntimeType.CreateInstanceImpl(Boolean publicOnly, Boolean skipVisibilityChecks, Boolean fillCache) +230 System.Activator.CreateInstance(Type type, Boolean nonPublic) +67 System.Web.Mvc.DefaultControllerFactory.GetControllerInstance(RequestContext requestContext, Type controllerType) +80 [InvalidOperationException: An error occurred when trying to create a controller of type 'uqs.Controllers.ManagePermissionsController'. Make sure that the controller has a parameterless public constructor.] System.Web.Mvc.DefaultControllerFactory.GetControllerInstance(RequestContext requestContext, Type controllerType) +190 System.Web.Mvc.DefaultControllerFactory.CreateController(RequestContext requestContext, String controllerName) +68 System.Web.Mvc.MvcHandler.ProcessRequestInit(HttpContextBase httpContext, IController& controller, IControllerFactory& factory) +118 System.Web.Mvc.MvcHandler.BeginProcessRequest(HttpContextBase httpContext, AsyncCallback callback, Object state) +46 System.Web.Mvc.MvcHandler.BeginProcessRequest(HttpContext httpContext, AsyncCallback callback, Object state) +63 System.Web.Mvc.MvcHandler.System.Web.IHttpAsyncHandler.BeginProcessRequest(HttpContext context, AsyncCallback cb, Object extraData) +13 System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +8679186 System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +155 please, somebody, help me to catch problem.
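    From the stack trace it looks like the default controller factory is trying to call a parameterless constructor that my ManagePermissionsController doesn't have. Without bringing in a full IoC container, I'm guessing something like a small custom controller factory would work; this is only a sketch, and the way DataManager is constructed here is a guess at my own wiring, not real code from the project.

    using System;
    using System.Web.Mvc;
    using System.Web.Routing;

    // Sketch: supply the DataManager dependency ourselves, because
    // DefaultControllerFactory can only call a parameterless constructor.
    public class SimpleControllerFactory : DefaultControllerFactory
    {
        protected override IController GetControllerInstance(RequestContext requestContext, Type controllerType)
        {
            if (controllerType == typeof(ManagePermissionsController))
            {
                // guessed wiring; DataManager construction may look different in my app
                return new ManagePermissionsController(new DataManager(new uqsEntities()));
            }

            return base.GetControllerInstance(requestContext, controllerType);
        }
    }

    // registered once in Global.asax.cs:
    // protected void Application_Start()
    // {
    //     RegisterRoutes(RouteTable.Routes);
    //     ControllerBuilder.Current.SetControllerFactory(new SimpleControllerFactory());
    // }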

    Read the article

  • Trac vs. Redmine vs. JIRA vs. FogBugz for one-man shop?

    - by kizzx2
    Background: I am a one-man freelancer looking for project management software that meets the requirements below. I have used Trac for about a year, tried Redmine and FogBugz on Demand for a couple of weeks each, and have never tried JIRA. Basically, I'm looking for a piece of software that facilitates developer-client communication/collaboration and does time tracking.
    Requirements:
    - Records time estimates / does time tracking
    - Clients must be able to create/edit their own tickets/cases
    - Clients must not see developer-created (internal) tickets/cases
    - Affordable (in price) with multiple clients
    Nice-to-haves:
    - Supports multiple projects in one installation
    - Free Eclipse integration (Mylyn)
    - Easy time tracking without using the Web UI (Trac's post-commit hook or Redmine's commit message scanning)
    - Clients can access the wiki
    - Exports the data to standard formats
    My evaluation: Trac can fulfill most of the requirements above, but only with so many customizations and plug-ins that it no longer feels clean. One downside is that the main trunk (0.11) has been around for a year or more and I still haven't seen much sign of an upcoming release. Redmine has the cleanest Web UI. Its design philosophy seems the most elegant, with its innovative commit message scanning and so on. However, the current version doesn't seem very mature or stable yet: it doesn't support internal (private) tickets, and the time-tracking commit message patch doesn't work against the trunk version. The good thing is that the main trunk still seems to be actively developed. FogBugz is a very well written piece of software, but paying $25/month just so a client can log in to the system seems a bit much for an individual developer. The free version lets clients create and view their own cases by email, which is a sub-optimal alternative to a full-fledged list of the user's own cases; it also means clients can't read or write wiki pages. Its time-tracking approach is innovative and good, though. On the other hand, all of the Eclipse integrations (Bugclipse, Foglyn) are commercial, which means yet more investment before I can even use my bug tracker. And if I fall back to the Web UI, it isn't a particularly fast-rendering web application. On the plus side, the built-in report functions are excellent (e.g. evidence-based scheduling). JIRA is something I have zero experience with. Can someone with JIRA experience explain why it might be a good fit for this particular situation?
    Question: Can we share experience on this? Are there any specific plugins/customizations that would best suit the requirements in this case?

    Read the article

  • Silverlight 4 WCF RIA Services and MVVM is not as simple

    - by Thomas Jaskula
    [Disclaimer: I'm an ASP.NET MVC developer] Hi, I'm looking for some best practices for implementing the MVVM pattern with WCF RIA Services in Silverlight 4. I'm not looking to use MEF or an IoC container for locating my ViewModels, and I don't want to use other frameworks like Prism or the MVVM Light Toolkit. I found many examples on the Internet showing how wonderful it is to drag and drop a data source onto the view and be done (it reminds me of my first VB6 projects). I tried to implement MVVM with WCF RIA and it's not straightforward at all. As I understand it, the ViewModel should contain all the logic so it can be unit tested in isolation, but combining that with WCF RIA is another story. I have the following questions. Can I use the generated metadata classes as my model? That would be easier than writing everything from scratch. As far as I can see, the only ways to get data are through the DomainContext or through direct binding in the view (a local resource). I don't want direct binding in the view; it's not testable at all. On the other hand, the DomainContext doesn't expose any single entity; all I have is the EntitySet, which I can bind to a datagrid. How do I bind a single entity to a DataForm from the ViewModel? How do I save updates back to the database? How do I navigate from one entity to a collection of its items? For example, if I have a Company entity, I would like to show a DataForm to update the company's details and a datagrid to show the company's addresses. When saving the form, I would like to save the Company information and, for example, record which address was selected as active. Please help me understand how to do this well. Or should I drop WCF RIA and do it with plain WCF from scratch? What do you think?
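    To make the single-entity question concrete, this is roughly the shape I was imagining for the ViewModel; the MyDomainContext, Company and GetCompaniesQuery() names are just placeholders for whatever the RIA code generation produces in my project, and the point is only the shape of the Load/SubmitChanges calls. Is this the right direction?

    using System.Collections.ObjectModel;
    using System.ComponentModel;
    using System.ServiceModel.DomainServices.Client;

    // Rough sketch of a ViewModel talking to the generated DomainContext directly.
    public class CompanyViewModel : INotifyPropertyChanged
    {
        private readonly MyDomainContext _context = new MyDomainContext();
        private Company _selectedCompany;

        public ObservableCollection<Company> Companies { get; private set; }

        // bound to the DataForm's CurrentItem / the DataGrid's SelectedItem
        public Company SelectedCompany
        {
            get { return _selectedCompany; }
            set { _selectedCompany = value; RaisePropertyChanged("SelectedCompany"); }
        }

        public CompanyViewModel()
        {
            Companies = new ObservableCollection<Company>();
            _context.Load(_context.GetCompaniesQuery(), OnCompaniesLoaded, null);
        }

        private void OnCompaniesLoaded(LoadOperation<Company> op)
        {
            if (op.HasError) { op.MarkErrorAsHandled(); return; }
            foreach (Company c in op.Entities)
                Companies.Add(c);
        }

        // called from a Save command/button; pushes pending edits back through RIA
        public void Save()
        {
            _context.SubmitChanges(op => { if (op.HasError) op.MarkErrorAsHandled(); }, null);
        }

        public event PropertyChangedEventHandler PropertyChanged;
        private void RaisePropertyChanged(string name)
        {
            var handler = PropertyChanged;
            if (handler != null) handler(this, new PropertyChangedEventArgs(name));
        }
    }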

    Read the article

  • How to convert a DataSet object into an ObjectContext (Entity Framework) object on the fly?

    - by Marcel
    Hi all, I have an existing SQL Server database where I store data from large, domain-specific log files (often 100 MB and more), one database per file. After some analysis, the database is deleted again. From the database I have created both an Entity Framework model and a DataSet model via the Visual Studio designers. The DataSet is only for bulk importing data with SqlBulkCopy, after a quite complicated parsing process. All queries are then done using the Entity Framework model, whose CreateQuery method is exposed via an interface like this: public IQueryable<TTarget> GetResults<TTarget>() where TTarget : EntityObject, new() { return this.Context.CreateQuery<TTarget>(typeof(TTarget).Name); } Now, sometimes my files are very small, and in such a case I would like to skip the import into the database and just have an in-memory representation of the data, accessible as entities. The idea is to create the DataSet, but instead of bulk importing it, to transfer it directly into an ObjectContext that is accessible via the interface. Does this make sense? Here's what I have done for this conversion so far: I traverse all tables in the DataSet, convert the single rows into entities of the corresponding type and add them to an instantiated object of my typed entity context class, like so: MyEntities context = new MyEntities(); //create new in-memory context ///.... //get the item in the navigations table MyDataSet.NavigationResultRow dataRow = ds.NavigationResult.First(); //here, a foreach would be necessary in a real-world scenario NavigationResult entity = new NavigationResult { Direction = dataRow.Direction, ///... NavigationResultID = dataRow.NavigationResultID }; //convert to entities context.AddToNavigationResult(entity); //add to entities ///.... This is very tedious work, as I would need to create a converter for each of my entity types and iterate over each table in the DataSet I have. And beware if I ever change my database model.... Also, I have found that I can only instantiate MyEntities if I provide a valid connection string to a SQL Server database. Since I do not want to actually write to my fully fledged database each time, this gets in the way; I only want some in-memory proxy database. Can I do this more simply? Is there some automated way of doing such a conversion, like generating an ObjectContext out of a DataSet object? P.S.: I have seen a few questions about unit testing that seem somewhat related, but not quite the same.
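    To make the "converter per entity type" complaint more concrete, this is the sort of generic mapper I was thinking of instead of hand-written converters; it is only a sketch that matches column names to property names, and nullable columns, relationships and EntityKey handling would need more care.

    using System;
    using System.Data;
    using System.Reflection;

    // Sketch: a generic DataRow -> entity mapper, to avoid writing one converter per entity type by hand.
    public static class RowMapper
    {
        public static TEntity ToEntity<TEntity>(DataRow row) where TEntity : new()
        {
            TEntity entity = new TEntity();

            foreach (DataColumn column in row.Table.Columns)
            {
                PropertyInfo property = typeof(TEntity).GetProperty(column.ColumnName);
                if (property == null || !property.CanWrite || row[column] == DBNull.Value)
                    continue;

                // unwrap Nullable<T> so Convert.ChangeType gets the underlying type
                Type targetType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType;
                property.SetValue(entity, Convert.ChangeType(row[column], targetType), null);
            }

            return entity;
        }
    }

    // rough usage against the typed DataSet:
    // foreach (MyDataSet.NavigationResultRow row in ds.NavigationResult)
    //     context.AddToNavigationResult(RowMapper.ToEntity<NavigationResult>(row));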

    Read the article

  • RegisterRequiresControlState can only be called before and during PreRender.

    - by user203127
    Hi When i am trying to export data from gridview to excelML I am getting error like RegisterRequiresControlState can only be called before and during PreRender. I dont know why it is happening. Please help me to sove this issue. Here is my sample code. I didnt implement any prerender method in my code. If i need to implement what should i write in that. protected void Button4_Click(object sender, System.EventArgs e) { ConfigureExport(); RadGrid1.ExportSettings.Excel.Format = Telerik.Web.UI.GridExcelExportFormat.ExcelML; CheckBox1.Checked = true; RadGrid1.ExportSettings.ExportOnlyData = true; RadGrid1.MasterTableView.ExportToExcel(); } public void ConfigureExport() { RadGrid1.ExportSettings.ExportOnlyData = CheckBox1.Checked; RadGrid1.ExportSettings.IgnorePaging = CheckBox2.Checked; RadGrid1.ExportSettings.OpenInNewWindow = CheckBox3.Checked; } protected void RadGrid1_ExcelMLExportRowCreated(object source, Telerik.Web.UI.GridExcelBuilder.GridExportExcelMLRowCreatedArgs e) { if (e.RowType == Telerik.Web.UI.GridExcelBuilder.GridExportExcelMLRowType.DataRow) { if (e.Row.Cells[0] != null && ((string)e.Row.Cells[0].Data.DataItem).Contains("U")) { e.Row.Cells[0].StyleValue = "MyCustomStyle"; e.Worksheet.Name = "comcast"; } } } protected void RadGrid1_ExcelMLExportStylesCreated(object source, Telerik.Web.UI.GridExcelBuilder.GridExportExcelMLStyleCreatedArgs e) { foreach (Telerik.Web.UI.GridExcelBuilder.StyleElement style in e.Styles) { if (style.Id == "headerStyle") { style.FontStyle.Bold = true; style.FontStyle.Color = System.Drawing.Color.Gainsboro; style.InteriorStyle.Color = System.Drawing.Color.Wheat; style.InteriorStyle.Pattern = Telerik.Web.UI.GridExcelBuilder.InteriorPatternType.Solid; } else if (style.Id == "itemStyle") { style.InteriorStyle.Color = System.Drawing.Color.WhiteSmoke; style.InteriorStyle.Pattern = Telerik.Web.UI.GridExcelBuilder.InteriorPatternType.Solid; } else if (style.Id == "alternatingItemStyle") { style.InteriorStyle.Color = System.Drawing.Color.LightGray; style.InteriorStyle.Pattern = Telerik.Web.UI.GridExcelBuilder.InteriorPatternType.Solid; } } Telerik.Web.UI.GridExcelBuilder.StyleElement myStyle = new Telerik.Web.UI.GridExcelBuilder.StyleElement("MyCustomStyle"); myStyle.FontStyle.Bold = true; myStyle.FontStyle.Italic = true; myStyle.InteriorStyle.Color = System.Drawing.Color.Gray; myStyle.InteriorStyle.Pattern = Telerik.Web.UI.GridExcelBuilder.InteriorPatternType.Solid; e.Styles.Add(myStyle); }

    Read the article

  • Doing TDD Silverlight 4 RC using Visual Studio 2010 RC

    - by user133992
    First, I am glad to see better TDD support in VS2010. Support for generating code stubs from my tests is OK: not as good as more mature TDD plug-ins, but a good start. I am looking for some Silverlight 4.0 TDD best practices. First question: does anyone have links or recommendations? I know the new Silverlight unit test capabilities are much better (Jeff Wilcox's Mix presentation).

    What I am focusing on right now is using TDD to develop pure Silverlight 4.0 Class Library projects, i.e. projects without a Silverlight UI project. I've been able to get it to work, but not as cleanly as it should be. I can create an empty VS project, add a Silverlight 4 Class Library project, add a Test Project (not a Silverlight Unit Test Project but a plain Test Project), and add a simple test in the Test Project such as:

        namespace Calculator.Test
        {
            [TestClass]
            public class CalculatorTests
            {
                [TestMethod]
                public void CalulatorAddTest()
                {
                    Calc c = new Calc();
                    int expected = 10;
                    int actual = c.Add(6, 4);
                    Assert.AreEqual<int>(expected, actual);
                }
            }
        }

    Using the new Generate Type and Method from Test feature, it will generate the following code in the Silverlight project:

        namespace Calculator
        {
            public class Calc
            {
                public int Add(int p, int p_2)
                {
                    throw new NotImplementedException();
                }
            }
        }

    When I run the tests the first time, it says the target assembly is Silverlight and the test cannot be run (not the exact text, but the same general idea). When I change the implementation to:

        namespace Calculator
        {
            public class Calc
            {
                public int Add(int p, int p_2)
                {
                    return p + p_2;
                }
            }
        }

    and re-run the test, it works fine and the test goes green. It also works for all other TDD code I generate afterwards. I also get a warning mark on the Test Project's reference to the Calculator Silverlight class library assembly. Second question: any comments or ideas on whether this is just a bug in VS2010 RC, or is Silverlight Class Library TDD not really supported? I have not created a Silverlight UI project or changed any build or debug settings, so I have no idea what is hosting the Silverlight DLL.

    Finally, some of the Silverlight class libraries I need to write will provide functionality that requires elevated out-of-browser rights. Based on the above, it looks like I can use TDD Test Projects against regular Silverlight 4.0 class libraries, but I have no idea how I can TDD the elevated OOB functionality without also creating the UI component that gets installed. The UI piece is not really needed for the library development and gets in the way of what I actually want to TDD. I know I can (and will) mock some of that functionality, but at some point I will also need the real thing in my tests. Third question: any ideas on how to TDD a Silverlight 4.0 class library project that requires OOB elevated rights? Thanks!
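    On the third question, one hedged way to keep the out-of-browser piece testable without the UI project is to put the elevated calls behind a small interface that the class library depends on, so the plain Test Project exercises the logic against a fake and only a thin adapter (which lives with the installed OOB application) ever touches the real elevated APIs. The names below (IFileStore, FakeFileStore, LogImporter) are made up for illustration.

        // Contract the class library is written against; no elevated rights needed here.
        public interface IFileStore
        {
            string ReadAllText(string path);
        }

        // Test double used from the plain MSTest project.
        public class FakeFileStore : IFileStore
        {
            public string Content { get; set; }
            public string ReadAllText(string path) { return Content; }
        }

        // Library code under test; it never knows whether the store is real or fake.
        public class LogImporter
        {
            private readonly IFileStore store;
            public LogImporter(IFileStore store) { this.store = store; }

            public int CountLines(string path)
            {
                return store.ReadAllText(path).Split('\n').Length;
            }
        }

    The production IFileStore implementation would use the elevated OOB file APIs and only gets wired up in the installed application, so it stays out of the TDD loop until integration testing.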

    Read the article

  • SunTlsRsaPremasterSecret KeyGenerator not available

    - by Jill
    Hi, I encountered an error when my application tries to load an RSA algorithm provider class in Java. The exception stack is as follows:

        javax.jms.JMSException: RSA premaster secret error
            at org.apache.activemq.util.JMSExceptionSupport.create(JMSExceptionSupport.java:49)
            at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1255)
            at org.apache.activemq.ActiveMQConnection.ensureConnectionInfoSent(ActiveMQConnection.java:1350)
            at org.apache.activemq.ActiveMQConnection.setClientID(ActiveMQConnection.java:388)
            at com.trendmicro.tmsm.TMSMAgent.open(TMSMAgent.java:63)
        Caused by: javax.net.ssl.SSLKeyException: RSA premaster secret error
            at com.sun.net.ssl.internal.ssl.RSAClientKeyExchange.<init>(RSAClientKeyExchange.java:97)
            at com.sun.net.ssl.internal.ssl.ClientHandshaker.serverHelloDone(ClientHandshaker.java:634)
            at com.sun.net.ssl.internal.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:226)
            at com.sun.net.ssl.internal.ssl.Handshaker.processLoop(Handshaker.java:516)
            at com.sun.net.ssl.internal.ssl.Handshaker.process_record(Handshaker.java:454)
            at com.sun.net.ssl.internal.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:884)
            at com.sun.net.ssl.internal.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1112)
            at com.sun.net.ssl.internal.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:623)
            at com.sun.net.ssl.internal.ssl.AppOutputStream.write(AppOutputStream.java:59)
            at org.apache.activemq.transport.tcp.TcpBufferedOutputStream.flush(TcpBufferedOutputStream.java:115)
            at java.io.DataOutputStream.flush(DataOutputStream.java:106)
            at org.apache.activemq.transport.tcp.TcpTransport.oneway(TcpTransport.java:167)
            at org.apache.activemq.transport.InactivityMonitor.oneway(InactivityMonitor.java:237)
            at org.apache.activemq.transport.WireFormatNegotiator.sendWireFormat(WireFormatNegotiator.java:168)
            at org.apache.activemq.transport.WireFormatNegotiator.sendWireFormat(WireFormatNegotiator.java:84)
            at org.apache.activemq.transport.WireFormatNegotiator.start(WireFormatNegotiator.java:74)
            at org.apache.activemq.transport.failover.FailoverTransport.doReconnect(FailoverTransport.java:715)
            at org.apache.activemq.transport.failover.FailoverTransport$2.iterate(FailoverTransport.java:115)
            at org.apache.activemq.thread.PooledTaskRunner.runTask(PooledTaskRunner.java:122)
            at org.apache.activemq.thread.PooledTaskRunner$1.run(PooledTaskRunner.java:43)
            at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
            at java.lang.Thread.run(Thread.java:637)
        Caused by: java.security.NoSuchAlgorithmException: SunTlsRsaPremasterSecret KeyGenerator not available
            at javax.crypto.KeyGenerator.<init>(DashoA13*..)
            at javax.crypto.KeyGenerator.getInstance(DashoA13*..)
            at com.sun.net.ssl.internal.ssl.JsseJce.getKeyGenerator(JsseJce.java:223)
            at com.sun.net.ssl.internal.ssl.RSAClientKeyExchange.<init>(RSAClientKeyExchange.java:89)
            ... 22 more

    I've googled the error message and most of the posts say it's because the JVM cannot find sunjce_provider.jar. However, I can find the file in the /Library/Java/Home/lib/ext folder. The platform is Mac OS X 10.6 and the Java version is 1.6.0_17. My questions are: Why doesn't the JVM search /Library/Java/Home/lib/ext for jar files? Can we change the CLASSPATH or the java.ext.dirs property by modifying some config file? Any suggestions to solve this problem? Thanks in advance.

    Read the article

  • Problem with Google Chrome

    - by user365559
    Hi. I have a JavaScript file for history management. It is not working in Chrome: when I try to navigate to the previous page with the back button in the browser, I can see the URL change, but it doesn't go to the preceding page.
    BrowserHistoryUtils = { addEvent: function(elm, evType, fn, useCapture) { useCapture = useCapture || false; if (elm.addEventListener) { elm.addEventListener(evType, fn, useCapture); return true; } else if (elm.attachEvent) { var r = elm.attachEvent('on' + evType, fn); return r; } else { elm['on' + evType] = fn; } } } BrowserHistory = (function() { // type of browser var browser = { ie: false, firefox: false, safari: false, opera: false, version: -1 }; // if setDefaultURL has been called, our first clue // that the SWF is ready and listening //var swfReady = false; // the URL we'll send to the SWF once it is ready //var pendingURL = ''; // Default app state URL to use when no fragment ID present var defaultHash = ''; // Last-known app state URL var currentHref = document.location.href; // Initial URL (used only by IE) var initialHref = document.location.href; // Initial URL (used only by IE) var initialHash = document.location.hash; // History frame source URL prefix (used only by IE) var historyFrameSourcePrefix = 'history/historyFrame.html?'; // History maintenance (used only by Safari) var currentHistoryLength = -1; var historyHash = []; var initialState = createState(initialHref, initialHref + '#' + initialHash, initialHash); var backStack = []; var forwardStack = []; var currentObjectId = null; //UserAgent detection var useragent = navigator.userAgent.toLowerCase(); if (useragent.indexOf("opera") != -1) { browser.opera = true; } else if (useragent.indexOf("msie") != -1) { browser.ie = true; browser.version = parseFloat(useragent.substring(useragent.indexOf('msie') + 4)); } else if (useragent.indexOf("safari") != -1) { browser.safari = true; browser.version = parseFloat(useragent.substring(useragent.indexOf('safari') + 7)); } else if (useragent.indexOf("gecko") != -1) { browser.firefox = true; } if (browser.ie == true && browser.version == 7) { window["_ie_firstload"] = false; } // Accessor functions for obtaining specific elements of the page. function getHistoryFrame() { return document.getElementById('ie_historyFrame'); } function getAnchorElement() { return document.getElementById('firefox_anchorDiv'); } function getFormElement() { return document.getElementById('safari_formDiv'); } function getRememberElement() { return document.getElementById("safari_remember_field"); } // Get the Flash player object for performing ExternalInterface callbacks. // Updated for changes to SWFObject2. 
function getPlayer(id) { if (id && document.getElementById(id)) { var r = document.getElementById(id); if (typeof r.SetVariable != "undefined") { return r; } else { var o = r.getElementsByTagName("object"); var e = r.getElementsByTagName("embed"); if (o.length > 0 && typeof o[0].SetVariable != "undefined") { return o[0]; } else if (e.length > 0 && typeof e[0].SetVariable != "undefined") { return e[0]; } } } else { var o = document.getElementsByTagName("object"); var e = document.getElementsByTagName("embed"); if (e.length > 0 && typeof e[0].SetVariable != "undefined") { return e[0]; } else if (o.length > 0 && typeof o[0].SetVariable != "undefined") { return o[0]; } else if (o.length > 1 && typeof o[1].SetVariable != "undefined") { return o[1]; } } return undefined; } function getPlayers() { var players = []; if (players.length == 0) { var tmp = document.getElementsByTagName('object'); players = tmp; } if (players.length == 0 || players[0].object == null) { var tmp = document.getElementsByTagName('embed'); players = tmp; } return players; } function getIframeHash() { var doc = getHistoryFrame().contentWindow.document; var hash = String(doc.location.search); if (hash.length == 1 && hash.charAt(0) == "?") { hash = ""; } else if (hash.length >= 2 && hash.charAt(0) == "?") { hash = hash.substring(1); } return hash; } /* Get the current location hash excluding the '#' symbol. */ function getHash() { // It would be nice if we could use document.location.hash here, // but it's faulty sometimes. var idx = document.location.href.indexOf('#'); return (idx >= 0) ? document.location.href.substr(idx+1) : ''; } /* Get the current location hash excluding the '#' symbol. */ function setHash(hash) { // It would be nice if we could use document.location.hash here, // but it's faulty sometimes. if (hash == '') hash = '#' document.location.hash = hash; } function createState(baseUrl, newUrl, flexAppUrl) { return { 'baseUrl': baseUrl, 'newUrl': newUrl, 'flexAppUrl': flexAppUrl, 'title': null }; } /* Add a history entry to the browser. * baseUrl: the portion of the location prior to the '#' * newUrl: the entire new URL, including '#' and following fragment * flexAppUrl: the portion of the location following the '#' only */ function addHistoryEntry(baseUrl, newUrl, flexAppUrl) { //delete all the history entries forwardStack = []; if (browser.ie) { //Check to see if we are being asked to do a navigate for the first //history entry, and if so ignore, because it's coming from the creation //of the history iframe if (flexAppUrl == defaultHash && document.location.href == initialHref && window['_ie_firstload']) { currentHref = initialHref; return; } if ((!flexAppUrl || flexAppUrl == defaultHash) && window['_ie_firstload']) { newUrl = baseUrl + '#' + defaultHash; flexAppUrl = defaultHash; } else { // for IE, tell the history frame to go somewhere without a '#' // in order to get this entry into the browser history. 
getHistoryFrame().src = historyFrameSourcePrefix + flexAppUrl; } setHash(flexAppUrl); } else { //ADR if (backStack.length == 0 && initialState.flexAppUrl == flexAppUrl) { initialState = createState(baseUrl, newUrl, flexAppUrl); } else if(backStack.length > 0 && backStack[backStack.length - 1].flexAppUrl == flexAppUrl) { backStack[backStack.length - 1] = createState(baseUrl, newUrl, flexAppUrl); } if (browser.safari) { // for Safari, submit a form whose action points to the desired URL if (browser.version <= 419.3) { var file = window.location.pathname.toString(); file = file.substring(file.lastIndexOf("/")+1); getFormElement().innerHTML = '<form name="historyForm" action="'+file+'#' + flexAppUrl + '" method="GET"></form>'; //get the current elements and add them to the form var qs = window.location.search.substring(1); var qs_arr = qs.split("&"); for (var i = 0; i < qs_arr.length; i++) { var tmp = qs_arr[i].split("="); var elem = document.createElement("input"); elem.type = "hidden"; elem.name = tmp[0]; elem.value = tmp[1]; document.forms.historyForm.appendChild(elem); } document.forms.historyForm.submit(); } else { top.location.hash = flexAppUrl; } // We also have to maintain the history by hand for Safari historyHash[history.length] = flexAppUrl; _storeStates(); } else { // Otherwise, write an anchor into the page and tell the browser to go there addAnchor(flexAppUrl); setHash(flexAppUrl); } } backStack.push(createState(baseUrl, newUrl, flexAppUrl)); } function _storeStates() { if (browser.safari) { getRememberElement().value = historyHash.join(","); } } function handleBackButton() { //The "current" page is always at the top of the history stack. var current = backStack.pop(); if (!current) { return; } var last = backStack[backStack.length - 1]; if (!last && backStack.length == 0){ last = initialState; } forwardStack.push(current); } function handleForwardButton() { //summary: private method. Do not call this directly. var last = forwardStack.pop(); if (!last) { return; } backStack.push(last); } function handleArbitraryUrl() { //delete all the history entries forwardStack = []; } /* Called periodically to poll to see if we need to detect navigation that has occurred */ function checkForUrlChange() { if (browser.ie) { if (currentHref != document.location.href && currentHref + '#' != document.location.href) { //This occurs when the user has navigated to a specific URL //within the app, and didn't use browser back/forward //IE seems to have a bug where it stops updating the URL it //shows the end-user at this point, but programatically it //appears to be correct. Do a full app reload to get around //this issue. if (browser.version < 7) { currentHref = document.location.href; document.location.reload(); } else { if (getHash() != getIframeHash()) { // this.iframe.src = this.blankURL + hash; var sourceToSet = historyFrameSourcePrefix + getHash(); getHistoryFrame().src = sourceToSet; } } } } if (browser.safari) { // For Safari, we have to check to see if history.length changed. if (currentHistoryLength >= 0 && history.length != currentHistoryLength) { //alert("did change: " + history.length + ", " + historyHash.length + "|" + historyHash[history.length] + "|>" + historyHash.join("|")); // If it did change, then we have to look the old state up // in our hand-maintained array since document.location.hash // won't have changed, then call back into BrowserManager. 
currentHistoryLength = history.length; var flexAppUrl = historyHash[currentHistoryLength]; if (flexAppUrl == '') { //flexAppUrl = defaultHash; } //ADR: to fix multiple if (typeof BrowserHistory_multiple != "undefined" && BrowserHistory_multiple == true) { var pl = getPlayers(); for (var i = 0; i < pl.length; i++) { pl[i].browserURLChange(flexAppUrl); } } else { getPlayer().browserURLChange(flexAppUrl); } _storeStates(); } } if (browser.firefox) { if (currentHref != document.location.href) { var bsl = backStack.length; var urlActions = { back: false, forward: false, set: false } if ((window.location.hash == initialHash || window.location.href == initialHref) && (bsl == 1)) { urlActions.back = true; // FIXME: could this ever be a forward button? // we can't clear it because we still need to check for forwards. Ugg. // clearInterval(this.locationTimer); handleBackButton(); } // first check to see if we could have gone forward. We always halt on // a no-hash item. if (forwardStack.length > 0) { if (forwardStack[forwardStack.length-1].flexAppUrl == getHash()) { urlActions.forward = true; handleForwardButton(); } } // ok, that didn't work, try someplace back in the history stack if ((bsl >= 2) && (backStack[bsl - 2])) { if (backStack[bsl - 2].flexAppUrl == getHash()) { urlActions.back = true; handleBackButton(); } } if (!urlActions.back && !urlActions.forward) { var foundInStacks = { back: -1, forward: -1 } for (var i = 0; i < backStack.length; i++) { if (backStack[i].flexAppUrl == getHash() && i != (bsl - 2)) { arbitraryUrl = true; foundInStacks.back = i; } } for (var i = 0; i < forwardStack.length; i++) { if (forwardStack[i].flexAppUrl == getHash() && i != (bsl - 2)) { arbitraryUrl = true; foundInStacks.forward = i; } } handleArbitraryUrl(); } // Firefox changed; do a callback into BrowserManager to tell it. currentHref = document.location.href; var flexAppUrl = getHash(); if (flexAppUrl == '') { //flexAppUrl = defaultHash; } //ADR: to fix multiple if (typeof BrowserHistory_multiple != "undefined" && BrowserHistory_multiple == true) { var pl = getPlayers(); for (var i = 0; i < pl.length; i++) { pl[i].browserURLChange(flexAppUrl); } } else { getPlayer().browserURLChange(flexAppUrl); } } } //setTimeout(checkForUrlChange, 50); } /* Write an anchor into the page to legitimize it as a URL for Firefox et al. 
*/ function addAnchor(flexAppUrl) { if (document.getElementsByName(flexAppUrl).length == 0) { getAnchorElement().innerHTML += "<a name='" + flexAppUrl + "'>" + flexAppUrl + "</a>"; } } var _initialize = function () { if (browser.ie) { var scripts = document.getElementsByTagName('script'); for (var i = 0, s; s = scripts[i]; i++) { if (s.src.indexOf("history.js") > -1) { var iframe_location = (new String(s.src)).replace("history.js", "historyFrame.html"); } } historyFrameSourcePrefix = iframe_location + "?"; var src = historyFrameSourcePrefix; var iframe = document.createElement("iframe"); iframe.id = 'ie_historyFrame'; iframe.name = 'ie_historyFrame'; //iframe.src = historyFrameSourcePrefix; try { document.body.appendChild(iframe); } catch(e) { setTimeout(function() { document.body.appendChild(iframe); }, 0); } } if (browser.safari) { var rememberDiv = document.createElement("div"); rememberDiv.id = 'safari_rememberDiv'; document.body.appendChild(rememberDiv); rememberDiv.innerHTML = '<input type="text" id="safari_remember_field" style="width: 500px;">'; var formDiv = document.createElement("div"); formDiv.id = 'safari_formDiv'; document.body.appendChild(formDiv); var reloader_content = document.createElement('div'); reloader_content.id = 'safarireloader'; var scripts = document.getElementsByTagName('script'); for (var i = 0, s; s = scripts[i]; i++) { if (s.src.indexOf("history.js") > -1) { html = (new String(s.src)).replace(".js", ".html"); } } reloader_content.innerHTML = '<iframe id="safarireloader-iframe" src="about:blank" frameborder="no" scrolling="no"></iframe>'; document.body.appendChild(reloader_content); reloader_content.style.position = 'absolute'; reloader_content.style.left = reloader_content.style.top = '-9999px'; iframe = reloader_content.getElementsByTagName('iframe')[0]; if (document.getElementById("safari_remember_field").value != "" ) { historyHash = document.getElementById("safari_remember_field").value.split(","); } } if (browser.firefox) { var anchorDiv = document.createElement("div"); anchorDiv.id = 'firefox_anchorDiv'; document.body.appendChild(anchorDiv); } //setTimeout(checkForUrlChange, 50); } return { historyHash: historyHash, backStack: function() { return backStack; }, forwardStack: function() { return forwardStack }, getPlayer: getPlayer, initialize: function(src) { _initialize(src); }, setURL: function(url) { document.location.href = url; }, getURL: function() { return document.location.href; }, getTitle: function() { return document.title; }, setTitle: function(title) { try { backStack[backStack.length - 1].title = title; } catch(e) { } //if on safari, set the title to be the empty string. if (browser.safari) { if (title == "") { try { var tmp = window.location.href.toString(); title = tmp.substring((tmp.lastIndexOf("/")+1), tmp.lastIndexOf("#")); } catch(e) { title = ""; } } } document.title = title; }, setDefaultURL: function(def) { defaultHash = def; def = getHash(); //trailing ? is important else an extra frame gets added to the history //when navigating back to the first page. Alternatively could check //in history frame navigation to compare # and ?. 
if (browser.ie) { window['_ie_firstload'] = true; var sourceToSet = historyFrameSourcePrefix + def; var func = function() { getHistoryFrame().src = sourceToSet; window.location.replace("#" + def); setInterval(checkForUrlChange, 50); } try { func(); } catch(e) { window.setTimeout(function() { func(); }, 0); } } if (browser.safari) { currentHistoryLength = history.length; if (historyHash.length == 0) { historyHash[currentHistoryLength] = def; var newloc = "#" + def; window.location.replace(newloc); } else { //alert(historyHash[historyHash.length-1]); } //setHash(def); setInterval(checkForUrlChange, 50); } if (browser.firefox || browser.opera) { var reg = new RegExp("#" + def + "$"); if (window.location.toString().match(reg)) { } else { var newloc ="#" + def; window.location.replace(newloc); } setInterval(checkForUrlChange, 50); //setHash(def); } }, /* Set the current browser URL; called from inside BrowserManager to propagate * the application state out to the container. */ setBrowserURL: function(flexAppUrl, objectId) { if (browser.ie && typeof objectId != "undefined") { currentObjectId = objectId; } //fromIframe = fromIframe || false; //fromFlex = fromFlex || false; //alert("setBrowserURL: " + flexAppUrl); //flexAppUrl = (flexAppUrl == "") ? defaultHash : flexAppUrl ; var pos = document.location.href.indexOf('#'); var baseUrl = pos != -1 ? document.location.href.substr(0, pos) : document.location.href; var newUrl = baseUrl + '#' + flexAppUrl; if (document.location.href != newUrl && document.location.href + '#' != newUrl) { currentHref = newUrl; addHistoryEntry(baseUrl, newUrl, flexAppUrl); currentHistoryLength = history.length; } return false; }, browserURLChange: function(flexAppUrl) { var objectId = null; if (browser.ie && currentObjectId != null) { objectId = currentObjectId; } pendingURL = ''; if (typeof BrowserHistory_multiple != "undefined" && BrowserHistory_multiple == true) { var pl = getPlayers(); for (var i = 0; i < pl.length; i++) { try { pl[i].browserURLChange(flexAppUrl); } catch(e) { } } } else { try { getPlayer(objectId).browserURLChange(flexAppUrl); } catch(e) { } } currentObjectId = null; } } })(); // Initialization // Automated unit testing and other diagnostics function setURL(url) { document.location.href = url; } function backButton() { history.back(); } function forwardButton() { history.forward(); } function goForwardOrBackInHistory(step) { history.go(step); } //BrowserHistoryUtils.addEvent(window, "load", function() { BrowserHistory.initialize(); }); (function(i) { var u =navigator.userAgent;var e=/*@cc_on!@*/false; var st = setTimeout; if(/webkit/i.test(u)){ st(function(){ var dr=document.readyState; if(dr=="loaded"||dr=="complete"){i()} else{st(arguments.callee,10);}},10); } else if((/mozilla/i.test(u)&&!/(compati)/.test(u)) || (/opera/i.test(u))){ document.addEventListener("DOMContentLoaded",i,false); } else if(e){ (function(){ var t=document.createElement('doc:rdy'); try{t.doScroll('left'); i();t=null; }catch(e){st(arguments.callee,0);}})(); } else{ window.onload=i; } })( function() {BrowserHistory.initialize();} );

    Read the article

  • Help with Nicedit - removeFormat function

    - by Franck
    Hello, I'm trying to get around Nicedit, and especially the "removeFormat" function. The problem is I cannot find the "removeFormat" method source code in the code below. The JS syntax looks strange to me. Can someone help me ? /* NicEdit - Micro Inline WYSIWYG * Copyright 2007-2008 Brian Kirchoff * * NicEdit is distributed under the terms of the MIT license * For more information visit http://nicedit.com/ * Do not remove this copyright message */ var bkExtend = function(){ var A = arguments; if (A.length == 1) { A = [this, A[0]] } for (var B in A[1]) { A[0][B] = A[1][B] } return A[0] }; function bkClass(){ } bkClass.prototype.construct = function(){ }; bkClass.extend = function(C){ var A = function(){ if (arguments[0] !== bkClass) { return this.construct.apply(this, arguments) } }; var B = new this(bkClass); bkExtend(B, C); A.prototype = B; A.extend = this.extend; return A }; var bkElement = bkClass.extend({ construct: function(B, A){ if (typeof(B) == "string") { B = (A || document).createElement(B) } B = $BK(B); return B }, appendTo: function(A){ A.appendChild(this); return this }, appendBefore: function(A){ A.parentNode.insertBefore(this, A); return this }, addEvent: function(B, A){ bkLib.addEvent(this, B, A); return this }, setContent: function(A){ this.innerHTML = A; return this }, pos: function(){ var C = curtop = 0; var B = obj = this; if (obj.offsetParent) { do { C += obj.offsetLeft; curtop += obj.offsetTop } while (obj = obj.offsetParent) } var A = (!window.opera) ? parseInt(this.getStyle("border-width") || this.style.border) || 0 : 0; return [C + A, curtop + A + this.offsetHeight] }, noSelect: function(){ bkLib.noSelect(this); return this }, parentTag: function(A){ var B = this; do { if (B && B.nodeName && B.nodeName.toUpperCase() == A) { return B } B = B.parentNode } while (B); return false }, hasClass: function(A){ return this.className.match(new RegExp("(\s|^)nicEdit-" + A + "(\s|$)")) }, addClass: function(A){ if (!this.hasClass(A)) { this.className += " nicEdit-" + A } return this }, removeClass: function(A){ if (this.hasClass(A)) { this.className = this.className.replace(new RegExp("(\s|^)nicEdit-" + A + "(\s|$)"), " ") } return this }, setStyle: function(A){ var B = this.style; for (var C in A) { switch (C) { case "float": B.cssFloat = B.styleFloat = A[C]; break; case "opacity": B.opacity = A[C]; B.filter = "alpha(opacity=" + Math.round(A[C] * 100) + ")"; break; case "className": this.className = A[C]; break; default: B[C] = A[C] } } return this }, getStyle: function(A, C){ var B = (!C) ? document.defaultView : C; if (this.nodeType == 1) { return (B && B.getComputedStyle) ? B.getComputedStyle(this, null).getPropertyValue(A) : this.currentStyle[bkLib.camelize(A)] } }, remove: function(){ this.parentNode.removeChild(this); return this }, setAttributes: function(A){ for (var B in A) { this[B] = A[B] } return this } }); var bkLib = { isMSIE: (navigator.appVersion.indexOf("MSIE") != -1), addEvent: function(C, B, A){ (C.addEventListener) ? 
C.addEventListener(B, A, false) : C.attachEvent("on" + B, A) }, toArray: function(C){ var B = C.length, A = new Array(B); while (B--) { A[B] = C[B] } return A }, noSelect: function(B){ if (B.setAttribute && B.nodeName.toLowerCase() != "input" && B.nodeName.toLowerCase() != "textarea") { B.setAttribute("unselectable", "on") } for (var A = 0; A < B.childNodes.length; A++) { bkLib.noSelect(B.childNodes[A]) } }, camelize: function(A){ return A.replace(/-(.)/g, function(B, C){ return C.toUpperCase() }) }, inArray: function(A, B){ return (bkLib.search(A, B) != null) }, search: function(A, C){ for (var B = 0; B < A.length; B++) { if (A[B] == C) { return B } } return null }, cancelEvent: function(A){ A = A || window.event; if (A.preventDefault && A.stopPropagation) { A.preventDefault(); A.stopPropagation() } return false }, domLoad: [], domLoaded: function(){ if (arguments.callee.done) { return } arguments.callee.done = true; for (i = 0; i < bkLib.domLoad.length; i++) { bkLib.domLoadi } }, onDomLoaded: function(A){ this.domLoad.push(A); if (document.addEventListener) { document.addEventListener("DOMContentLoaded", bkLib.domLoaded, null) } else { if (bkLib.isMSIE) { document.write(".nicEdit-main p { margin: 0; }<\/script"); $BK("__ie_onload").onreadystatechange = function(){ if (this.readyState == "complete") { bkLib.domLoaded() } } } } window.onload = bkLib.domLoaded } }; function $BK(A){ if (typeof(A) == "string") { A = document.getElementById(A) } return (A && !A.appendTo) ? bkExtend(A, bkElement.prototype) : A } var bkEvent = { addEvent: function(A, B){ if (B) { this.eventList = this.eventList || {}; this.eventList[A] = this.eventList[A] || []; this.eventList[A].push(B) } return this }, fireEvent: function(){ var A = bkLib.toArray(arguments), C = A.shift(); if (this.eventList && this.eventList[C]) { for (var B = 0; B < this.eventList[C].length; B++) { this.eventList[C][B].apply(this, A) } } } }; function __(A){ return A } Function.prototype.closure = function(){ var A = this, B = bkLib.toArray(arguments), C = B.shift(); return function(){ if (typeof(bkLib) != "undefined") { return A.apply(C, B.concat(bkLib.toArray(arguments))) } } }; Function.prototype.closureListener = function(){ var A = this, C = bkLib.toArray(arguments), B = C.shift(); return function(E){ E = E || window.event; if (E.target) { var D = E.target } else { var D = E.srcElement } return A.apply(B, [E, D].concat(C)) } }; var nicEditorConfig = bkClass.extend({ buttons: { 'bold': { name: _('Mettre en gras'), command: 'Bold', tags: ['B', 'STRONG'], css: { 'font-weight': 'bold' }, key: 'b' }, 'italic': { name: _('Mettre en italique'), command: 'Italic', tags: ['EM', 'I'], css: { 'font-style': 'italic' }, key: 'i' }, 'underline': { name: _('Souligner'), command: 'Underline', tags: ['U'], css: { 'text-decoration': 'underline' }, key: 'u' }, 'left': { name: _('Aligné à gauche'), command: 'justifyleft', noActive: true }, 'center': { name: _('Centré'), command: 'justifycenter', noActive: true }, 'right': { name: _('Aligné à droite'), command: 'justifyright', noActive: true }, 'justify': { name: _('Justifié'), command: 'justifyfull', noActive: true }, 'ol': { name: _('Liste non ordonnée'), command: 'insertorderedlist', tags: ['OL'] }, 'ul': { name: _('Liste non ordonnée'), command: 'insertunorderedlist', tags: ['UL'] }, 'subscript': { name: _('Placer en indice'), command: 'subscript', tags: ['SUB'] }, 'superscript': { name: _('Placer en exposant'), command: 'superscript', tags: ['SUP'] }, 'strikethrough': { name: _('Barrer le texte'), 
command: 'strikeThrough', css: { 'text-decoration': 'line-through' } }, 'removeformat': { name: _('Supprimer la mise en forme'), command: 'removeformat', noActive: true }, 'indent': { name: _('Indenter'), command: 'indent', noActive: true }, 'outdent': { name: _('Remove Indent'), command: 'outdent', noActive: true }, 'hr': { name: _('Ligne horizontale'), command: 'insertHorizontalRule', noActive: true } }, iconsPath: 'http://js.nicedit.com/nicEditIcons-latest.gif', buttonList: ['save', 'bold', 'italic', 'underline', 'left', 'center', 'right', 'justify', 'ol', 'ul', 'fontSize', 'fontFamily', 'fontFormat', 'indent', 'outdent', 'image', 'upload', 'link', 'unlink', 'forecolor', 'bgcolor'], iconList: { "xhtml": 1, "bgcolor": 2, "forecolor": 3, "bold": 4, "center": 5, "hr": 6, "indent": 7, "italic": 8, "justify": 9, "left": 10, "ol": 11, "outdent": 12, "removeformat": 13, "right": 14, "save": 25, "strikethrough": 16, "subscript": 17, "superscript": 18, "ul": 19, "underline": 20, "image": 21, "link": 22, "unlink": 23, "close": 24, "arrow": 26, "upload": 27, "question":2 } }); ; var nicEditors = { nicPlugins: [], editors: [], registerPlugin: function(B, A){ this.nicPlugins.push({ p: B, o: A }) }, allTextAreas: function(C){ var A = document.getElementsByTagName("textarea"); for (var B = 0; B < A.length; B++) { nicEditors.editors.push(new nicEditor(C).panelInstance(A[B])) } return nicEditors.editors }, findEditor: function(C){ var B = nicEditors.editors; for (var A = 0; A < B.length; A++) { if (B[A].instanceById(C)) { return B[A].instanceById(C) } } } }; var nicEditor = bkClass.extend({ construct: function(C){ this.options = new nicEditorConfig(); bkExtend(this.options, C); this.nicInstances = new Array(); this.loadedPlugins = new Array(); var A = nicEditors.nicPlugins; for (var B = 0; B < A.length; B++) { this.loadedPlugins.push(new A[B].p(this, A[B].o)) } nicEditors.editors.push(this); bkLib.addEvent(document.body, "mousedown", this.selectCheck.closureListener(this)) }, panelInstance: function(B, C){ B = this.checkReplace($BK(B)); var A = new bkElement("DIV").setStyle({ width: (parseInt(B.getStyle("width")) || B.clientWidth) + "px" }).appendBefore(B); this.setPanel(A); return this.addInstance(B, C) }, checkReplace: function(B){ var A = nicEditors.findEditor(B); if (A) { A.removeInstance(B); A.removePanel() } return B }, addInstance: function(B, C){ B = this.checkReplace($BK(B)); if (B.contentEditable || !!window.opera) { var A = new nicEditorInstance(B, C, this) } else { var A = new nicEditorIFrameInstance(B, C, this) } this.nicInstances.push(A); return this }, removeInstance: function(C){ C = $BK(C); var B = this.nicInstances; for (var A = 0; A < B.length; A++) { if (B[A].e == C) { B[A].remove(); this.nicInstances.splice(A, 1) } } }, removePanel: function(A){ if (this.nicPanel) { this.nicPanel.remove(); this.nicPanel = null } }, instanceById: function(C){ C = $BK(C); var B = this.nicInstances; for (var A = 0; A < B.length; A++) { if (B[A].e == C) { return B[A] } } }, setPanel: function(A){ this.nicPanel = new nicEditorPanel($BK(A), this.options, this); this.fireEvent("panel", this.nicPanel); return this }, nicCommand: function(B, A){ if (this.selectedInstance) { this.selectedInstance.nicCommand(B, A) } }, getIcon: function(D, A){ var C = this.options.iconList[D]; var B = (A.iconFiles) ? A.iconFiles[D] : ""; return { backgroundImage: "url('" + ((C) ? this.options.iconsPath : B) + "')", backgroundPosition: ((C) ? 
((C - 1) * -18) : 0) + "px 0px" } }, selectCheck: function(C, A){ var B = false; do { if (A.className && A.className.indexOf("nicEdit") != -1) { return false } } while (A = A.parentNode); this.fireEvent("blur", this.selectedInstance, A); this.lastSelectedInstance = this.selectedInstance; this.selectedInstance = null; return false } }); nicEditor = nicEditor.extend(bkEvent); var nicEditorInstance = bkClass.extend({ isSelected: false, construct: function(G, D, C){ this.ne = C; this.elm = this.e = G; this.options = D || {}; newX = parseInt(G.getStyle("width")) || G.clientWidth; newY = parseInt(G.getStyle("height")) || G.clientHeight; this.initialHeight = newY - 8; var H = (G.nodeName.toLowerCase() == "textarea"); if (H || this.options.hasPanel) { var B = (bkLib.isMSIE && !((typeof document.body.style.maxHeight != "undefined") && document.compatMode == "CSS1Compat")); var E = { width: newX + "px", border: "1px solid #ccc", borderTop: 0, overflowY: "auto", overflowX: "hidden" }; E[(B) ? "height" : "maxHeight"] = (this.ne.options.maxHeight) ? this.ne.options.maxHeight + "px" : null; this.editorContain = new bkElement("DIV").setStyle(E).appendBefore(G); var A = new bkElement("DIV").setStyle({ width: (newX - 8) + "px", margin: "4px", minHeight: newY + "px" }).addClass("main").appendTo(this.editorContain); G.setStyle({ display: "none" }); A.innerHTML = G.innerHTML; if (H) { A.setContent(G.value); this.copyElm = G; var F = G.parentTag("FORM"); if (F) { bkLib.addEvent(F, "submit", this.saveContent.closure(this)) } } A.setStyle((B) ? { height: newY + "px" } : { overflow: "hidden" }); this.elm = A } this.ne.addEvent("blur", this.blur.closure(this)); this.init(); this.blur() }, init: function(){ this.elm.setAttribute("contentEditable", "true"); if (this.getContent() == "") { this.setContent("") } this.instanceDoc = document.defaultView; this.elm.addEvent("mousedown", this.selected.closureListener(this)).addEvent("keypress", this.keyDown.closureListener(this)).addEvent("focus", this.selected.closure(this)).addEvent("blur", this.blur.closure(this)).addEvent("keyup", this.selected.closure(this)); this.elm.addEvent("resizestart",function(){return false}); this.elm.addEvent("dragstart",function(){return false}); this.ne.fireEvent("add", this); }, remove: function(){ this.saveContent(); if (this.copyElm || this.options.hasPanel) { this.editorContain.remove(); this.e.setStyle({ display: "block" }); this.ne.removePanel() } this.disable(); this.ne.fireEvent("remove", this) }, disable: function(){ this.elm.setAttribute("contentEditable", "false") }, getSel: function(){ return (window.getSelection) ? window.getSelection() : document.selection }, getRng: function(){ var A = this.getSel(); if (!A) { return null } return (A.rangeCount 0) ? A.getRangeAt(0) : A.createRange() }, selRng: function(A, B){ if (window.getSelection) { B.removeAllRanges(); B.addRange(A) } else { A.select() } }, selElm: function(){ var C = this.getRng(); if (C.startContainer) { var D = C.startContainer; if (C.cloneContents().childNodes.length == 1) { for (var B = 0; B < D.childNodes.length; B++) { var A = D.childNodes[B].ownerDocument.createRange(); A.selectNode(D.childNodes[B]); if (C.compareBoundaryPoints(Range.START_TO_START, A) != 1 && C.compareBoundaryPoints(Range.END_TO_END, A) != -1) { return $BK(D.childNodes[B]) } } } return $BK(D) } else { return $BK((this.getSel().type == "Control") ? 
C.item(0) : C.parentElement()) } }, saveRng: function(){ this.savedRange = this.getRng(); this.savedSel = this.getSel() }, restoreRng: function(){ if (this.savedRange) { this.selRng(this.savedRange, this.savedSel) } }, keyDown: function(B, A){ if (B.ctrlKey) { this.ne.fireEvent("key", this, B) } }, selected: function(C, A){ if (!A) { A = this.selElm() } if (!C.ctrlKey) { var B = this.ne.selectedInstance; if (B != this) { if (B) { this.ne.fireEvent("blur", B, A) } this.ne.selectedInstance = this; this.ne.fireEvent("focus", B, A) } this.ne.fireEvent("selected", B, A); this.isFocused = true; this.elm.addClass("selected") } return false }, blur: function(){ this.isFocused = false; this.elm.removeClass("selected") }, saveContent: function(){ if (this.copyElm || this.options.hasPanel) { this.ne.fireEvent("save", this); (this.copyElm) ? this.copyElm.value = this.getContent() : this.e.innerHTML = this.getContent() } }, getElm: function(){ return this.elm }, getContent: function(){ this.content = this.getElm().innerHTML; this.ne.fireEvent("get", this); return this.content }, setContent: function(A){ this.content = A; this.ne.fireEvent("set", this); this.elm.innerHTML = this.content }, nicCommand: function(B, A){ document.execCommand(B, false, A) } }); var nicEditorIFrameInstance = nicEditorInstance.extend({ savedStyles: [], init: function(){ var B = this.elm.innerHTML.replace(/^\s+|\s+$/g, ""); this.elm.innerHTML = ""; (!B) ? B = "" : B; this.initialContent = B; this.elmFrame = new bkElement("iframe").setAttributes({ src: "javascript:;", frameBorder: 0, allowTransparency: "true", scrolling: "no" }).setStyle({ height: "100px", width: "100%" }).addClass("frame").appendTo(this.elm); if (this.copyElm) { this.elmFrame.setStyle({ width: (this.elm.offsetWidth - 4) + "px" }) } var A = ["font-size", "font-family", "font-weight", "color"]; for (itm in A) { this.savedStyles[bkLib.camelize(itm)] = this.elm.getStyle(itm) } setTimeout(this.initFrame.closure(this), 50) }, disable: function(){ this.elm.innerHTML = this.getContent() }, initFrame: function(){ var B = $BK(this.elmFrame.contentWindow.document); B.designMode = "on"; B.open(); var A = this.ne.options.externalCSS; B.write("" + ((A) ? '' : "") + '' + this.initialContent + ""); B.close(); this.frameDoc = B; this.frameWin = $BK(this.elmFrame.contentWindow); this.frameContent = $BK(this.frameWin.document.body).setStyle(this.savedStyles); this.instanceDoc = this.frameWin.document.defaultView; this.heightUpdate(); this.frameDoc.addEvent("mousedown", this.selected.closureListener(this)).addEvent("keyup", this.heightUpdate.closureListener(this)).addEvent("keydown", this.keyDown.closureListener(this)).addEvent("keyup", this.selected.closure(this)); this.ne.fireEvent("add", this) }, getElm: function(){ return this.frameContent }, setContent: function(A){ this.content = A; this.ne.fireEvent("set", this); this.frameContent.innerHTML = this.content; this.heightUpdate() }, getSel: function(){ return (this.frameWin) ? 
this.frameWin.getSelection() : this.frameDoc.selection }, heightUpdate: function(){ this.elmFrame.style.height = Math.max(this.frameContent.offsetHeight, this.initialHeight) + "px" }, nicCommand: function(B, A){ this.frameDoc.execCommand(B, false, A); setTimeout(this.heightUpdate.closure(this), 100) } }); var nicEditorPanel = bkClass.extend({ construct: function(E, B, A){ this.elm = E; this.options = B; this.ne = A; this.panelButtons = new Array(); this.buttonList = bkExtend([], this.ne.options.buttonList); this.panelContain = new bkElement("DIV").setStyle({ overflow: "hidden", width: "100%", border: "1px solid #cccccc", backgroundColor: "#efefef" }).addClass("panelContain"); this.panelElm = new bkElement("DIV").setStyle({ margin: "2px", marginTop: "0px", zoom: 1, overflow: "hidden" }).addClass("panel").appendTo(this.panelContain); this.panelContain.appendTo(E); var C = this.ne.options; var D = C.buttons; for (button in D) { this.addButton(button, C, true) } this.reorder(); E.noSelect() }, addButton: function(buttonName, options, noOrder){ var button = options.buttons[buttonName]; var type = (button.type) ? eval("(typeof(" + button.type + ') == "undefined") ? null : ' + button.type + ";") : nicEditorButton; var hasButton = bkLib.inArray(this.buttonList, buttonName); if (type && (hasButton || this.ne.options.fullPanel)) { this.panelButtons.push(new type(this.panelElm, buttonName, options, this.ne)); if (!hasButton) { this.buttonList.push(buttonName) } } }, findButton: function(B){ for (var A = 0; A < this.panelButtons.length; A++) { if (this.panelButtons[A].name == B) { return this.panelButtons[A] } } }, reorder: function(){ var C = this.buttonList; for (var B = 0; B < C.length; B++) { var A = this.findButton(C[B]); if (A) { this.panelElm.appendChild(A.margin) } } }, remove: function(){ this.elm.remove() } }); var nicEditorButton = bkClass.extend({ construct: function(D, A, C, B){ this.options = C.buttons[A]; this.name = A; this.ne = B; this.elm = D; this.margin = new bkElement("DIV").setStyle({ "float": "left", marginTop: "2px" }).appendTo(D); this.contain = new bkElement("DIV").setStyle({ width: "20px", height: "20px" }).addClass("buttonContain").appendTo(this.margin); this.border = new bkElement("DIV").setStyle({ backgroundColor: "#efefef", border: "1px solid #efefef" }).appendTo(this.contain); this.button = new bkElement("DIV").setStyle({ width: "18px", height: "18px", overflow: "hidden", zoom: 1, cursor: "pointer" }).addClass("button").setStyle(this.ne.getIcon(A, C)).appendTo(this.border); this.button.addEvent("mouseover", this.hoverOn.closure(this)).addEvent("mouseout", this.hoverOff.closure(this)).addEvent("mousedown", this.mouseClick.closure(this)).noSelect(); if (!window.opera) { this.button.onmousedown = this.button.onclick = bkLib.cancelEvent } B.addEvent("selected", this.enable.closure(this)).addEvent("blur", this.disable.closure(this)).addEvent("key", this.key.closure(this)); this.disable(); this.init() }, init: function(){ }, hide: function(){ this.contain.setStyle({ display: "none" }) }, updateState: function(){ if (this.isDisabled) { this.setBg() } else { if (this.isHover) { this.setBg("hover") } else { if (this.isActive) { this.setBg("active") } else { this.setBg() } } } }, setBg: function(A){ switch (A) { case "hover": var B = { border: "1px solid #666", backgroundColor: "#ddd" }; break; case "active": var B = { border: "1px solid #666", backgroundColor: "#ccc" }; break; default: var B = { border: "1px solid #efefef", backgroundColor: "#efefef" } } 
this.border.setStyle(B).addClass("button-" + A) }, checkNodes: function(A){ var B = A; do { if (this.options.tags && bkLib.inArray(this.options.tags, B.nodeName)) { this.activate(); return true } } while (B = B.parentNode && B.className != "nicEdit"); B = $BK(A); while (B.nodeType == 3) { B = $BK(B.parentNode) } if (this.options.css) { for (itm in this.options.css) { if (B.getStyle(itm, this.ne.selectedInstance.instanceDoc) == this.options.css[itm]) { this.activate(); return true } } } this.deactivate(); return false }, activate: function(){ if (!this.isDisabled) { this.isActive = true; this.updateState(); this.ne.fireEvent("buttonActivate", this) } }, deactivate: function(){ this.isActive = false; this.updateState(); if (!this.isDisabled) { th

    Read the article

  • System.ServiceModel.Syndication.SyndicationFeed throws when the RSS document includes a <script> block

    - by Cheeso
    The code: using (XmlReader xmlr = XmlReader.Create(new StringReader(allXml))) { var items = from item in SyndicationFeed.Load(xmlr).Items select item; } The exception: Exception: System.Xml.XmlException: Unexpected node type Element. ReadElementString method can only be called on elements with simple or empty content. Line 11, position 25. at System.Xml.XmlReader.ReadElementString() at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadXml(XmlReader reader, SyndicationFeed result) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFeed(XmlReader reader) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFrom(XmlReader reader) at System.ServiceModel.Syndication.SyndicationFeed.Load[TSyndicationFeed](XmlReader reader) at System.ServiceModel.Syndication.SyndicationFeed.Load(XmlReader reader) at Ionic.ToolsAndTests.ReadRss.Run() in c:\dev\dotnet\ReadRss.cs:line 90 The XML content: <?xml version="1.0" encoding="utf-8"?> <?xml-stylesheet type="text/xsl" href="https://www.ibm.com/developerworks/mydeveloperworks/blogs/roller-ui/styles/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom" > <channel> <title>Software architecture, software engineering, and Renaissance Jazz</title> <link>https://www.ibm.com/developerworks/mydeveloperworks/blogs/gradybooch</link> <atom:link rel="self" type="application/rss+xml" href="https://www.ibm.com/developerworks/mydeveloperworks/blogs/gradybooch/feed/entries/rss?lang=en" /> <description>Software architecture, software engineering, and Renaissance Jazz</description> <language>en-us</language> <copyright>Copyright <script type='text/javascript'> document.write(blogsDate.date.localize (1273534889181));</script></copyright> <lastBuildDate>Mon, 10 May 2010 19:41:29 -0400</lastBuildDate> As you can see, on line 11, at position 25, there's a script block inside the <copyright> element. Other people have reported similar errors with other XML documents. The way I worked around this was to do a StreamReader.ReadToEnd, then do Regex.Replace on the result of that to yank out the script block, before passing the modified string to XmlReader.Create(). Feels like a hack. Has anyone got a better approach? I don't like this because I have to read in a 125k string into memory. Is it valid rss to include "complex content" like that - a script block inside an element?
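    If the regex pre-pass over the raw string feels too fragile, a slightly tidier variant is to strip the offending <script> elements with LINQ to XML and hand the cleaned document to SyndicationFeed. This is only a sketch: it still materializes the whole feed in memory, so it does not address the 125k concern, and it assumes allXml holds the feed text as in the snippet above. As for the validity question, RSS 2.0 describes <copyright> as a plain text element, so embedding markup such as a <script> block there is at best questionable, which is presumably why ReadElementString balks.

        using System.ServiceModel.Syndication;
        using System.Xml;
        using System.Xml.Linq;

        // Drop every <script> element, then parse what remains as a normal feed.
        XDocument doc = XDocument.Parse(allXml);
        doc.Descendants("script").Remove();
        using (XmlReader reader = doc.CreateReader())
        {
            SyndicationFeed feed = SyndicationFeed.Load(reader);
            // ... enumerate feed.Items as before ...
        }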

    Read the article

  • MSDTC - Communication with the underlying transaction manager has failed (Firewall open, MSDTC netwo

    - by SocialAddict
    I'm having problems with my ASP.NET Web Forms system. It worked on our test server, but now that we are putting it live, one of the servers is within a DMZ and the SQL Server is outside of it (still on our network, although on a different subnet). I have opened up the firewall completely between these two boxes to see if that was the issue, and it still gives the error message "Communication with the underlying transaction manager has failed" whenever we try to use a TransactionScope. We can access the data for retrieval; it's just transactions that break it. We have also used msdtc ping to test the connection and, with the amendments to the firewall, that pings successfully, but the same error occurs! How do I resolve this error? Any help would be great as we have a system to go live today. Panic :)

    Edit: I have created a more straightforward test page with a transaction, as below, and this works fine. Could a nested transaction cause this kind of error, and if so, why would this only cause an issue when using a live box in a DMZ with a firewall?

        AuditRepository auditRepository = new AuditRepository();
        try
        {
            using (TransactionScope scope = new TransactionScope())
            {
                auditRepository.Add(DateTime.Now, 1, "TEST-TRANSACTIONS#1", 1);
                auditRepository.Save();
                auditRepository.Add(DateTime.Now, 1, "TEST-TRANSACTIONS#2", 1);
                auditRepository.Save();
                scope.Complete();
            }
        }
        catch (Exception ex)
        {
            Response.Write("Test Error For Transaction: " + ex.Message + "<br />" + ex.StackTrace);
        }

    This is the error stack we are getting when the problem occurs:

        at System.Transactions.TransactionInterop.GetOletxTransactionFromTransmitterPropigationToken(Byte[] propagationToken)
        at System.Transactions.TransactionStatePSPEOperation.PSPEPromote(InternalTransaction tx)
        at System.Transactions.TransactionStateDelegatedBase.EnterState(InternalTransaction tx)
        at System.Transactions.EnlistableStates.Promote(InternalTransaction tx)
        at System.Transactions.Transaction.Promote()
        at System.Transactions.TransactionInterop.ConvertToOletxTransaction(Transaction transaction)
        at System.Transactions.TransactionInterop.GetExportCookie(Transaction transaction, Byte[] whereabouts)
        at System.Data.SqlClient.SqlInternalConnection.GetTransactionCookie(Transaction transaction, Byte[] whereAbouts)
        at System.Data.SqlClient.SqlInternalConnection.EnlistNonNull(Transaction tx)
        at System.Data.SqlClient.SqlInternalConnection.Enlist(Transaction tx)
        at System.Data.SqlClient.SqlInternalConnectionTds.Activate(Transaction transaction)
        at System.Data.ProviderBase.DbConnectionInternal.ActivateConnection(Transaction transaction)
        at System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject)
        at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)
        at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
        at System.Data.SqlClient.SqlConnection.Open()
        at System.Data.Linq.SqlClient.SqlConnectionManager.UseConnection(IConnectionUser user)
        at System.Data.Linq.SqlClient.SqlProvider.get_IsSqlCe()
        at System.Data.Linq.SqlClient.SqlProvider.InitializeProviderMode()
        at System.Data.Linq.SqlClient.SqlProvider.System.Data.Linq.Provider.IProvider.Execute(Expression query)
        at System.Data.Linq.ChangeDirector.StandardChangeDirector.DynamicInsert(TrackedObject item)
        at System.Data.Linq.ChangeDirector.StandardChangeDirector.Insert(TrackedObject item)
        at System.Data.Linq.ChangeProcessor.SubmitChanges(ConflictMode failureMode)
        at System.Data.Linq.DataContext.SubmitChanges(ConflictMode failureMode)
        at System.Data.Linq.DataContext.SubmitChanges()
        at RegBook.classes.DbBase.Save()
        at RegBook.usercontrols.BookingProcess.confirmBookingButton_Click(Object sender, EventArgs e)
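    For what it's worth, TransactionScope only involves MSDTC once the transaction is promoted to a distributed one, and the usual trigger for promotion is a second connection (or a second DataContext opening its own connection) enlisting in the same scope; the simple test page above opens a single connection and stays local, which would explain why it works while the booking code does not. A hedged sketch of the single-connection shape, using plain ADO.NET and a made-up AuditLog table purely for illustration:

        using System.Data.SqlClient;
        using System.Transactions;

        using (TransactionScope scope = new TransactionScope())
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            // One connection, opened once inside the scope: the transaction stays
            // local to SQL Server, so MSDTC (and the DMZ firewall rules for it)
            // never comes into play.
            connection.Open();

            using (SqlCommand first = new SqlCommand(
                "INSERT INTO AuditLog (LoggedAt, Message) VALUES (GETDATE(), 'TEST-TRANSACTIONS#1')", connection))
            {
                first.ExecuteNonQuery();
            }
            using (SqlCommand second = new SqlCommand(
                "INSERT INTO AuditLog (LoggedAt, Message) VALUES (GETDATE(), 'TEST-TRANSACTIONS#2')", connection))
            {
                second.ExecuteNonQuery();
            }

            scope.Complete();
        }

    Whether the repositories can share one connection or DataContext is a design question for the booking code; if they cannot, MSDTC itself has to be configured for network access (inbound and outbound transactions allowed) on both machines, not just opened up at the firewall.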

    Read the article

  • FullCalendar events from asp.net ASHX page not displaying

    - by Steve Howard
    Hi, I have been trying to add some events to FullCalendar using a call to an ASHX page, with the following code.

    Page script:

        <script type="text/javascript">
            $(document).ready(function() {
                $('#calendar').fullCalendar({
                    header: {
                        left: 'prev,next today',
                        center: 'title',
                        right: 'month, agendaWeek,agendaDay'
                    },
                    events: 'FullCalendarEvents.ashx'
                })
            });

    C# code:

        public class EventsData
        {
            public int id { get; set; }
            public string title { get; set; }
            public string start { get; set; }
            public string end { get; set; }
            public string url { get; set; }
            public int accountId { get; set; }
        }

        public class FullCalendarEvents : IHttpHandler
        {
            private static List<EventsData> testEventsData = new List<EventsData>
            {
                new EventsData { accountId = 0, title = "test 1", start = DateTime.Now.ToString("yyyy-MM-dd"), id = 0 },
                new EventsData { accountId = 1, title = "test 2", start = DateTime.Now.AddHours(2).ToString("yyyy-MM-dd"), id = 2 }
            };

            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "application/json.";
                context.Response.Write(GetEventData());
            }

            private string GetEventData()
            {
                List<EventsData> ed = testEventsData;
                StringBuilder sb = new StringBuilder();
                sb.Append("[");
                foreach (var data in ed)
                {
                    sb.Append("{");
                    sb.Append(string.Format("id: {0},", data.id));
                    sb.Append(string.Format("title:'{0}',", data.title));
                    sb.Append(string.Format("start: '{0}',", data.start));
                    sb.Append("allDay: false");
                    sb.Append("},");
                }
                sb.Remove(sb.Length - 1, 1);
                sb.Append("]");
                return sb.ToString();
            }
        }

    The ASHX page gets called and returns the following data:

        [{id: 0,title:'test 1',start: '2010-06-07',allDay: false},{id: 2,title:'test 2',start: '2010-06-07',allDay: false}]

    The call to the ASHX page does not display any results, but if I paste the returned values directly into the events option, they display correctly. I have been trying to get this code to work for a day now and I can't see why the events are not getting set. Any help or advice on how I can get this to work would be appreciated. Steve
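    Two details in the handler look suspect, though without a network trace I cannot be sure either is the actual cause: the content type has a trailing dot ("application/json."), and the hand-built string uses unquoted property names, which is not valid JSON. Pasting the same text into the events option works because there it is parsed as a JavaScript object literal, whereas the AJAX response goes through jQuery's stricter JSON parsing. A hedged rewrite that lets a serializer produce valid JSON (this replaces ProcessRequest and GetEventData; JavaScriptSerializer lives in System.Web.Script.Serialization):

        public void ProcessRequest(HttpContext context)
        {
            // Correct MIME type, no trailing dot.
            context.Response.ContentType = "application/json";

            // Emits quoted property names, e.g. [{"id":0,"title":"test 1","start":"2010-06-07", ...}]
            JavaScriptSerializer serializer = new JavaScriptSerializer();
            context.Response.Write(serializer.Serialize(testEventsData));
        }

    An allDay property could also be added to EventsData if the flag from the hand-built version is still needed.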

    Read the article

  • VB.Net Dynamically Load DLL

    - by hermiod
    I am trying to write some code that will allow me to dynamically load DLLs into my application, depending on an application setting. The idea is that the database to be accessed is set in the application settings, and this then loads the appropriate DLL and assigns it to an instance of an interface for my application to access. This is my code at the moment:

        Dim SQLDataSource As ICRDataLayer
        Dim ass As Assembly = Assembly. _
            LoadFrom("M:\MyProgs\WebService\DynamicAssemblyLoading\SQLServer\bin\Debug\SQLServer.dll")
        Dim obj As Object = ass.CreateInstance(GetType(ICRDataLayer).ToString, True)
        SQLDataSource = DirectCast(obj, ICRDataLayer)
        MsgBox(SQLDataSource.ModuleName & vbNewLine & SQLDataSource.ModuleDescription)

    I have my interface (ICRDataLayer) and SQLServer.dll contains an implementation of this interface. I just want to load the assembly and assign it to the SQLDataSource object. The above code just doesn't work. There are no exceptions thrown; even the MsgBox doesn't appear. I would've expected at least the message box appearing with nothing in it, but even this doesn't happen!

    Is there a way to determine if a loaded assembly implements a specific interface? I tried the below, but this also doesn't seem to do anything!

        For Each loadedType As Type In ass.GetTypes
            If GetType(ICRDataLayer).IsAssignableFrom(loadedType) Then
                Dim obj1 As Object = ass.CreateInstance(GetType(ICRDataLayer).ToString, True)
                SQLDataSource = DirectCast(obj1, ICRDataLayer)
            End If
        Next

    EDIT: New code from Vlad's examples:

        Module CRDataLayerFactory
            Sub New()
            End Sub
            ' class name is a contract,
            ' should be the same for all plugins
            Private Function Create() As ICRDataLayer
                Return New SQLServer()
            End Function
        End Module

    Above is the Module in each DLL, converted from Vlad's C# example. Below is my code to bring in the DLL:

        Dim SQLDataSource As ICRDataLayer
        Dim ass As Assembly = Assembly. _
            LoadFrom("M:\MyProgs\WebService\DynamicAssemblyLoading\SQLServer\bin\Debug\SQLServer.dll")
        Dim factory As Object = ass.CreateInstance("CRDataLayerFactory", True)
        Dim t As Type = factory.GetType
        Dim method As MethodInfo = t.GetMethod("Create")
        Dim obj As Object = method.Invoke(factory, Nothing)
        SQLDataSource = DirectCast(obj, ICRDataLayer)

    EDIT: Implementation based on Paul Kohler's code:

        Dim file As String
        For Each file In Directory.GetFiles(baseDir, searchPattern, SearchOption.TopDirectoryOnly)
            Dim assemblyType As System.Type
            For Each assemblyType In Assembly.LoadFrom(file).GetTypes
                Dim s As System.Type() = assemblyType.GetInterfaces
                For Each ty As System.Type In s
                    If ty.Name.Contains("ICRDataLayer") Then
                        MsgBox(ty.Name)
                        plugin = DirectCast(Activator.CreateInstance(assemblyType), ICRDataLayer)
                        MessageBox.Show(plugin.ModuleName)
                    End If
                Next

    I get the following error with this code: Unable to cast object of type 'SQLServer.CRDataSource.SQLServer' to type 'DynamicAssemblyLoading.ICRDataLayer'. The actual DLL is in a different project called SQLServer, in the same solution as my implementation code. CRDataSource is a namespace and SQLServer is the actual class name in the DLL. The SQLServer class implements ICRDataLayer, so I don't understand why it wouldn't be able to cast it. Is the naming significant here? I wouldn't have thought it would be.
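    The InvalidCastException at the end is the classic symptom of the interface type existing in more than one assembly. If the SQLServer project compiles its own copy of ICRDataLayer (or the host assembly that defines it also sits next to the plugin DLL and gets loaded a second time in the LoadFrom context), the runtime sees two distinct types with the same name and the cast can never succeed; the namespace and class names themselves are not the problem. A hedged sketch of the usual layout, written in C# for brevity (the VB.NET equivalent is a direct translation); Contracts.dll and pluginPath are made-up names, and the interface members are sketched here as read-only properties:

        // Contracts.dll: referenced by BOTH the host application and every plugin project,
        // so there is exactly one ICRDataLayer type at runtime.
        public interface ICRDataLayer
        {
            string ModuleName { get; }
            string ModuleDescription { get; }
        }

        // Host application: scan a plugin DLL for concrete implementations of the shared interface.
        foreach (Type type in Assembly.LoadFrom(pluginPath).GetTypes())
        {
            if (typeof(ICRDataLayer).IsAssignableFrom(type) && !type.IsInterface && !type.IsAbstract)
            {
                ICRDataLayer plugin = (ICRDataLayer)Activator.CreateInstance(type);
                Console.WriteLine(plugin.ModuleName);
            }
        }

    With a single shared contracts assembly in place, the original DirectCast-based loop should work as written.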

    Read the article

< Previous Page | 886 887 888 889 890 891 892 893 894 895 896 897  | Next Page >