Search Results

Search found 451 results on 19 pages for 'filestream'.

Page 10/19 | < Previous Page | 6 7 8 9 10 11 12 13 14 15 16 17  | Next Page >

  • Will HttpResponse.Filter buffer all the data before it starts sending?

    - by vtortola
    Hi, a user posted this article about how to use HttpResponse.Filter to compress large amounts of data. But what will happen if I try to transfer a 4 GB file? Will it load the whole file into memory in order to compress it, or will it compress it chunk by chunk? I mean, I'm doing this right now: public void GetFile(HttpResponse response) { String fileName = "example.iso"; response.ClearHeaders(); response.ClearContent(); response.ContentType = "application/octet-stream"; response.AppendHeader("Content-Disposition", "attachment; filename=" + fileName); response.AppendHeader("Content-Length", new FileInfo(fileName).Length.ToString()); using (FileStream fs = new FileStream(Path.Combine(HttpContext.Current.Server.MapPath("~/App_Data"), fileName), FileMode.Open)) using (DeflateStream ds = new DeflateStream(fs,CompressionMode.Compress)) { Byte[] buffer = new Byte[4096]; Int32 readed = 0; while ((readed = ds.Read(buffer, 0, buffer.Length)) > 0) { response.OutputStream.Write(buffer, 0, readed); response.Flush(); } } } So at the same time I'm reading, I'm compressing and sending. I'd like to know whether HttpResponse.Filter does the same thing, or whether it loads the whole file into memory in order to compress it. Cheers.
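
    For comparison, a minimal sketch of the filter-based approach (names and header handling are illustrative, not from the original post): the handler still reads and writes in 4 KB chunks, and the DeflateStream wrapped around Response.Filter compresses each chunk as it passes through, so neither the file nor the compressed output is buffered whole in memory. A pre-computed Content-Length would be wrong once the body is compressed, so it is omitted here.

        // using System.IO; using System.IO.Compression; using System.Web;
        public void GetFileWithFilter(HttpResponse response)
        {
            string path = HttpContext.Current.Server.MapPath("~/App_Data/example.iso");
            response.ClearHeaders();
            response.ClearContent();
            response.ContentType = "application/octet-stream";
            response.AppendHeader("Content-Disposition", "attachment; filename=example.iso");
            response.AppendHeader("Content-Encoding", "deflate");   // the body is deflate-compressed
            // Wrap the existing filter: everything written to the response is compressed on the way out.
            response.Filter = new DeflateStream(response.Filter, CompressionMode.Compress);

            using (FileStream fs = File.OpenRead(path))
            {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                {
                    response.OutputStream.Write(buffer, 0, read);   // compressed and sent chunk by chunk
                    response.Flush();
                }
            }
        }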

    Read the article

  • SQL Server slow in production environment

    - by Lieven Cardoen
    I have a weird problem in a customer's production environment. I can't give any details on the infrastructure, except that SQL Server runs on a virtual server. The data, log and filestream files are on other storage servers (data and filestream together, and the log on a separate server). In our local test environment, one particular query executes with these durations (after first clearing the cache): 300 ms the first time (from then on it's cached), then 20 ms, 15 ms, 17 ms. In the customer's production environment, the SQL Server is more powerful, yet these are the durations (I didn't have the rights to clear the cache; I will try that tomorrow): 2500 ms, 2600 ms, 2400 ms. The servers in the customer's production environment are more powerful, but they are virtual servers (ours are not). What could be the cause? Not enough memory? Fragmentation? Physical storage? How would you tackle this performance problem? EDIT: Some people have asked me whether the data set is equal, and it is. I restored their database in our environment. It's true that this was the first thing I looked at. (@Everyone: I added the edit because it will be the first thing that many will think of.)

    Read the article

  • Problem with WCF Streaming

    - by H4mm3rHead
    Hi, I was looking at this thread: http://stackoverflow.com/questions/1935040/how-to-handle-large-file-uploads-via-wcf I need a web service hosted at my provider where I can upload and download files. We are talking videos from 1 MB to 100 MB, hence the streaming approach. I can't get it to work. I declared an interface: [ServiceContract] public interface IFileTransferService { [OperationContract] void UploadFile(Stream stream); } and all is fine. I implement it like this: public string FileName = "test"; public void UploadFile(Stream stream) { try { FileStream outStream = File.Open(FileName, FileMode.Create, FileAccess.Write); const int bufferLength = 4096; byte[] buffer = new byte[bufferLength]; int count = 0; while((count = stream.Read(buffer, 0, bufferLength)) > 0) { //progress outStream.Write(buffer, 0, count); } outStream.Close(); stream.Close(); //saved } catch(Exception ex) { throw new Exception("error: "+ex.Message); } } Still no problem; it's published to my web server out on the internet. So far so good. Now I add a service reference to it and want to pass it a FileStream, but the parameter is now a byte[]. Why is that, and how do I get it into the proper form for streaming? Edit: My binding looks like this: <bindings> <basicHttpBinding> <binding name="StreamingFileTransferServicesBinding" transferMode="StreamedRequest" maxBufferSize="65536" maxReceivedMessageSize="204003200" /> </basicHttpBinding> </bindings> I can consume it without problems and get no errors, other than that my input parameter has changed from a Stream to a byte[].
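
    One possibility, sketched below and not a verified fix: "Add Service Reference" regenerates the contract on the client and can map the streamed parameter to byte[]. Sharing the original IFileTransferService interface with the client (for example in a contract assembly) and creating the channel by hand keeps the Stream signature. The address, file path and sizes here are illustrative; the client-side binding must also be streamed.

        // using System.IO; using System.ServiceModel;
        var binding = new BasicHttpBinding
        {
            TransferMode = TransferMode.StreamedRequest,     // match the service's streaming mode
            MaxReceivedMessageSize = 204003200
        };
        var factory = new ChannelFactory<IFileTransferService>(
            binding, new EndpointAddress("http://example.com/FileTransferService.svc"));
        IFileTransferService proxy = factory.CreateChannel();

        using (FileStream fs = File.OpenRead(@"C:\videos\clip.wmv"))
        {
            proxy.UploadFile(fs);        // the parameter stays a Stream, no byte[] buffering
        }
        ((IClientChannel)proxy).Close();
        factory.Close();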

    Read the article

  • Dealing with SerializationExceptions in C#

    - by Tony
    I get a SerializationException ("(...) is not marked as serializable") in the following code: [Serializable] public class Wind { public MyText text; public Size MSize; public Point MLocation; public int MNumber; /.../ } [Serializable] public class MyText { public string MString; public Font MFont; public StringFormat StrFormat; public float MySize; public Color FColor, SColor, TColor; public bool IsRequest; public decimal MWide; /.../ } and the List to be serialized: List<Wind> MyList = new List<Wind>(); This code snippet: FileStream FS = new FileStream(AppDomain.CurrentDomain.BaseDirectory + "Sticks.dat", FileMode.Create); BinaryFormatter BF = new BinaryFormatter(); BF.Serialize(FS, MyList); FS.Close(); throws an exception: System.Runtime.Serialization.SerializationException was unhandled Message="Type 'System.Drawing.StringFormat' in Assembly 'System.Drawing, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' is not marked as serializable." How do I solve this problem?
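
    One way to deal with it, sketched below: Font, Color, Size and Point are serializable, but StringFormat is not (as the exception says), so it can be excluded from serialization and rebuilt after deserialization. This assumes the formatting settings can be defaulted or re-derived; otherwise the relevant values (alignment, flags, ...) would need to be stored in plain serializable fields.

        // using System.Drawing; using System.Runtime.Serialization;
        [Serializable]
        public class MyText
        {
            public string MString;
            public Font MFont;

            // StringFormat is not serializable, so keep it out of the serialized graph.
            [NonSerialized] public StringFormat StrFormat = new StringFormat();

            [OnDeserialized]
            private void RebuildStringFormat(StreamingContext context)
            {
                // Recreate it after BinaryFormatter.Deserialize, since the field comes back null.
                StrFormat = new StringFormat();
            }

            // ... remaining fields as in the original class ...
        }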

    Read the article

  • Excel 2007 file writer in C# results in a corrupt file

    - by Martin
    Hi, I am using a BinaryReader to read an Excel 2007 file from an Exchange mailbox using OWA; the file is then written to disk using a BinaryWriter. My problem is that the two files don't match when the writer finishes. Worse still, Excel 2007 won't open the written file. Previously, Excel 2003 had no problem with the solution below, and Excel 2007 has no issue if the file is in the Excel 2003 format; the problem only occurs with the Excel 2007 format (*.xlsx). BinaryReader: using(System.IO.Stream stream = resource.GetInputStream(attachedFiles[k].Address)) { using(System.IO.BinaryReader br = new System.IO.BinaryReader(stream)) { attachment.Data = new byte[attachedFiles[k].Size]; int bufPosn=0, len=0; while ((len = br.Read( attachment.Data, bufPosn, attachment.Data.Length-bufPosn )) > 0) { bufPosn += len; } br.Close(); } } BinaryWriter: FileStream fs = new FileStream(fileName, FileMode.Create); BinaryWriter binWriter = new BinaryWriter(fs); binWriter.Write( content, 0, content.Length ); binWriter.Close(); fs.Close(); Suggestions gratefully received.
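
    One guess, not a confirmed diagnosis: .xlsx files are ZIP containers, so any mismatch between the reported attachment size and the bytes actually delivered (for example trailing zero bytes in a buffer sized from attachedFiles[k].Size) corrupts them, whereas the old binary .xls format is more forgiving. A sketch that sizes the data from what is actually read, using the question's own resource/attachedFiles objects:

        // using System.IO;
        byte[] data;
        using (Stream stream = resource.GetInputStream(attachedFiles[k].Address))
        using (MemoryStream ms = new MemoryStream())
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                ms.Write(buffer, 0, read);      // copy exactly the bytes received
            }
            data = ms.ToArray();                // length equals the bytes actually read
        }
        File.WriteAllBytes(fileName, data);     // write the same number of bytes to disk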

    Read the article

  • Cast to delegate type fails in JScript.NET

    - by dnewcome
    I am trying to do async IO using BeginRead() in JScript.NET, but I can't get the callback function to work correctly. Here is the code: function readFileAsync() { var fs : FileStream = new FileStream( 'test.txt', FileMode.Open, FileAccess.Read ); var result : IAsyncResult = fs.BeginRead( new byte[8], 0, 8, readFileCallback, fs ); Thread.Sleep( Timeout.Infinite ); } var readFileCallback = function( result : IAsyncResult ) : void { print( 'ListenerCallback():' ); } The exception is a cast failure: Unhandled Exception: System.InvalidCastException: Unable to cast object of type 'Microsoft.JScript.Closure' to type 'System.AsyncCallback'. at JScript 0.readFileAsync(Object this, VsaEngine vsa Engine) at JScript 0.Global Code() at JScript Main.Main(String[] ) I have tried an explicit cast both to AsyncCallback and to the base MulticastDelegate and Delegate types, to no avail. Delegates are supposed to be created automatically, obviating the need to create a new AsyncCallback explicitly, e.g.: BeginRead( ... new AsyncDelegate( readFileCallback), object ); and in fact, if you try to create the delegate explicitly, the compiler issues an error. I must be missing something here.

    Read the article

  • Send email from a worker role (Azure) with an attachment in C#

    - by simplyvaibh
    I am trying to send an email (in C#) from a worker role (Azure) with an attachment (from blob storage). I am able to send the email, but the attachment (a Word document) is blank. The following function is called from the worker role: public void sendMail(string blobName) { InitStorage();//Initialize the storage var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString"); container = blobStorage.GetContainerReference("Container Name"); CloudBlockBlob blob = container.GetBlockBlobReference(blobName); if (File.Exists("demo.doc")) File.Delete("demo.doc"); FileStream fs = new FileStream("demo.doc", FileMode.OpenOrCreate); blob.DownloadToStream(fs); Attachment attach = new Attachment(fs,"Report.doc"); System.Net.Mail.MailMessage Email = new System.Net.Mail.MailMessage("[email protected]", "[email protected]"); Email.Subject = "Text fax send via email"; Email.Subject = "Subject Of email"; Email.Attachments.Add(attach); Email.Body = "Body of email"; System.Net.Mail.SmtpClient client = new SmtpClient("smtp.live.com", 25); client.DeliveryMethod = SmtpDeliveryMethod.Network; client.EnableSsl = true; client.Credentials = new NetworkCredential("[email protected]", Password); client.Send(Email); fs.Flush(); fs.Close(); Email.Dispose(); } Please tell me where I am going wrong.
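
    A likely cause: the FileStream is attached immediately after blob.DownloadToStream(fs), so its position is at the end of the downloaded data when SmtpClient reads it, which yields an empty attachment. One sketch of a fix rewinds the stream before attaching and uses a MemoryStream so no temp file is needed (addresses are placeholders; keep the existing SmtpClient configuration):

        // using System.IO; using System.Net.Mail;
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        using (MemoryStream ms = new MemoryStream())
        {
            blob.DownloadToStream(ms);
            ms.Position = 0;                       // rewind, otherwise the attachment is read from the end
            Attachment attach = new Attachment(ms, "Report.doc", "application/msword");

            using (MailMessage email = new MailMessage("sender@example.com", "recipient@example.com"))
            {
                email.Subject = "Subject of email";
                email.Body = "Body of email";
                email.Attachments.Add(attach);
                client.Send(email);                // client configured exactly as in the question
            }
        }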

    Read the article

  • XML serialization of a collection in C#

    - by Archana R
    I have two classes as follows: public class Info { [XmlAttribute] public string language; public int version; public Book book; public Info() { } public Info(string l, int v, string author, int quantity, int price) { this.language = l; this.version = v; book = new Book(author, quantity, price); } } public class Book { [XmlAttribute] public string author; public int quantity; public int price; [XmlIgnore]public int total; public NameValueCollection nvcollection = new NameValueCollection(); public Book() { } public Book(string author, int quantity, int price) { this.author = author; this.quantity = quantity; this.price = price; total = quantity * price; nvcollection.Add(author, price.ToString()); } } I have created a List<Info> and added two instances of the Info class as follows: FileStream fs = new FileStream("SerializedInfo.XML", FileMode.Create); List<Info> arrList = new List<Info>(); XmlSerializer xs = new XmlSerializer(typeof(List<Info>)); Info pObj = new Info("ABC", 3, "DEF", 2, 6); Info pObj1 = new Info("GHI", 4, "JKL", 2, 8); arrList.Add(pObj); arrList.Add(pObj1); xs.Serialize(fs, arrList); fs.Close(); But when I try to serialize, I get an exception: "There was an error reflecting type 'System.Collections.Generic.List`1[ConsoleApplicationSerialization.Info]'." Can anyone help me with it? Also, instead of NameValueCollection, which type of structure can I use?
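
    The InnerException of that "error reflecting type" exception normally names the real culprit; NameValueCollection (like Dictionary) is one of the collection types XmlSerializer cannot handle. A sketch of a replacement shape, where PriceEntry is an invented helper class and a plain List<T> of it serializes cleanly:

        // using System.Collections.Generic; using System.Xml.Serialization;
        public class PriceEntry
        {
            [XmlAttribute] public string Author;
            [XmlAttribute] public int Price;
        }

        public class Book
        {
            [XmlAttribute] public string author;
            public int quantity;
            public int price;
            [XmlIgnore] public int total;

            // XmlSerializer handles a plain list of simple items,
            // unlike NameValueCollection or Dictionary<,>.
            public List<PriceEntry> Prices = new List<PriceEntry>();
        }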

    Read the article

  • Why is my GZipStream not writeable?

    - by Ozzah
    I have some GZ compressed resources in my program and I need to be able to write them out to temporary files for use. I wrote the following function to write the files out and return true on success or false on failure. In addition, I've put a try/catch in there which shows a MessageBox in the event of an error: private static bool extractCompressedResource(byte[] resource, string path) { try { using (MemoryStream ms = new MemoryStream(resource)) { using (FileStream fs = new FileStream(path, FileMode.Create, FileAccess.ReadWrite)) { using (GZipStream zs = new GZipStream(fs, CompressionMode.Decompress)) { ms.CopyTo(zs); // Throws exception zs.Close(); ms.Close(); } } } } catch (Exception ex) { MessageBox.Show(ex.Message); // Stream is not writeable return false; } return true; } I've put a comment on the line which throws the exception. If I put a breakpoint on that line and take a look inside the GZipStream then I can see that it's not writeable (which is what's causing the problem). Am I doing something wrong, or is this a limitation of the GZipStream class?
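
    A possible explanation: a GZipStream constructed with CompressionMode.Decompress is a read-only stream over compressed input, so wrapping the output FileStream with it leaves nothing writable, and ms.CopyTo(zs) then fails. A sketch with the plumbing reversed (assumes .NET 4 for Stream.CopyTo, as in the original):

        // using System.IO; using System.IO.Compression; using System.Windows.Forms;
        private static bool extractCompressedResource(byte[] resource, string path)
        {
            try
            {
                using (MemoryStream ms = new MemoryStream(resource))
                using (GZipStream zs = new GZipStream(ms, CompressionMode.Decompress))   // reads and inflates the resource
                using (FileStream fs = new FileStream(path, FileMode.Create, FileAccess.Write))
                {
                    zs.CopyTo(fs);   // decompressed bytes flow into the writable FileStream
                }
                return true;
            }
            catch (Exception ex)
            {
                MessageBox.Show(ex.Message);
                return false;
            }
        }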

    Read the article

  • File transfer eating a lot of CPU

    - by Dan C.
    I'm trying to transfer a file over an IHttpHandler, and the code is pretty simple. However, when I start a single transfer it uses about 20% of the CPU. If I were to scale this to 20 simultaneous transfers, the CPU usage would be very high. Is there a better way to do this and keep the CPU usage lower? The client code just sends over chunks of the file, 64 KB at a time. public void ProcessRequest(HttpContext context) { if (context.Request.Params["secretKey"] == null) { } else { accessCode = context.Request.Params["secretKey"].ToString(); } if (accessCode == "test") { string fileName = context.Request.Params["fileName"].ToString(); byte[] buffer = Convert.FromBase64String(context.Request.Form["data"]); string fileGuid = context.Request.Params["smGuid"].ToString(); string user = context.Request.Params["user"].ToString(); SaveFile(fileName, buffer, user); } } public void SaveFile(string fileName, byte[] buffer, string user) { string DirPath = @"E:\Filestorage\" + user + @"\"; if (!Directory.Exists(DirPath)) { Directory.CreateDirectory(DirPath); } string FilePath = @"E:\Filestorage\" + user + @"\" + fileName; FileStream writer = new FileStream(FilePath, File.Exists(FilePath) ? FileMode.Append : FileMode.Create, FileAccess.Write, FileShare.ReadWrite); writer.Write(buffer, 0, buffer.Length); writer.Close(); }
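
    One plausible source of the load, not a measured result: Base64-decoding every 64 KB form post (plus reopening the FileStream for each chunk) is expensive. If the client can be changed to send the raw bytes in the request body, the handler can stream them straight to disk. A sketch assuming .NET 4 (for Stream.CopyTo) and the same paths and parameters as the question:

        // using System.IO; using System.Web;
        public void ProcessRequest(HttpContext context)
        {
            if (context.Request.Params["secretKey"] != "test") return;

            string fileName = context.Request.Params["fileName"];
            string user = context.Request.Params["user"];
            string dirPath = Path.Combine(@"E:\Filestorage", user);
            Directory.CreateDirectory(dirPath);          // no-op if it already exists

            // The client POSTs the raw bytes of the chunk (or the whole file); no Base64, no form encoding.
            using (FileStream writer = new FileStream(Path.Combine(dirPath, fileName),
                                                      FileMode.Append, FileAccess.Write, FileShare.Read))
            {
                context.Request.InputStream.CopyTo(writer);
            }
        }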

    Read the article

  • Keeping the UI responsive while parsing a very large logfile

    - by Carlos
    I'm writing an app that parses a very large logfile so that the user can see the contents in a treeview format. I've used a BackgroundWorker to read the file, and as it parses each message, I use BeginInvoke to get the GUI thread to add a node to my treeview. Unfortunately, there are two issues: (1) the treeview is unresponsive to clicks or scrolls while the file is being parsed, and I would like users to be able to examine (i.e. expand) nodes while the file is parsing, so that they don't have to wait for the whole file to finish; (2) the treeview flickers each time a new node is added. Here's the code inside the form: private void btnChangeDir_Click(object sender, EventArgs e) { OpenFileDialog browser = new OpenFileDialog(); if (browser.ShowDialog() == DialogResult.OK) { tbSearchDir.Text = browser.FileName; BackgroundWorker bgw = new BackgroundWorker(); bgw.DoWork += (ob, evArgs) => ParseFile(tbSearchDir.Text); bgw.RunWorkerAsync(); } } private void ParseFile(string inputfile) { FileStream logFileStream = new FileStream(inputfile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite); StreamReader LogsFile = new StreamReader(logFileStream); while (!LogsFile.EndOfStream) { string Msgtxt = LogsFile.ReadLine(); Message msg = new Message(Msgtxt.Substring(26)); //Reads the text into a class with appropriate members AddTreeViewNode(msg); } } private void AddTreeViewNode(Message msg) { TreeNode newNode = new TreeNode(msg.SeqNum); BeginInvoke(new Action(() => { treeView1.BeginUpdate(); treeView1.Nodes.Add(newNode); treeView1.EndUpdate(); Refresh(); } )); } What needs to be changed?
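
    A direction to try, sketched below rather than a drop-in fix: marshalling to the UI thread once per line floods the message queue, which is what keeps the treeview from responding and makes it repaint constantly. Batching the nodes and adding them a few hundred at a time inside BeginUpdate/EndUpdate (and dropping the full Refresh()) keeps the UI thread mostly idle. The batch size of 500 is arbitrary; Message and SeqNum are the question's own types.

        // using System; using System.Collections.Generic; using System.IO; using System.Windows.Forms;
        private void ParseFile(string inputfile)
        {
            var batch = new List<TreeNode>();
            using (var reader = new StreamReader(new FileStream(inputfile, FileMode.Open,
                                                 FileAccess.Read, FileShare.ReadWrite)))
            {
                while (!reader.EndOfStream)
                {
                    Message msg = new Message(reader.ReadLine().Substring(26));
                    batch.Add(new TreeNode(msg.SeqNum));
                    if (batch.Count >= 500)
                    {
                        FlushBatch(batch);
                    }
                }
            }
            FlushBatch(batch);                       // add whatever is left at the end
        }

        private void FlushBatch(List<TreeNode> batch)
        {
            if (batch.Count == 0) return;
            TreeNode[] nodes = batch.ToArray();      // snapshot before the worker keeps filling the list
            batch.Clear();
            BeginInvoke(new Action(() =>
            {
                treeView1.BeginUpdate();             // suppress repaints while adding -> no flicker
                treeView1.Nodes.AddRange(nodes);
                treeView1.EndUpdate();
            }));
        }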

    Read the article

  • Multiple calendars in Exchange Web Services

    - by user3559462
    I have multiple calendars in my mailbox. I can retrieve only one calendar, the main Calendar folder, using the EWS Managed API 2.0; now I want the whole list of calendars, with the appointments and meetings in each. For example, I have three calendars, the first being the main calendar: 1. Calendar (color code: default) 2. Jorgen (color code: pink) 3. Soren (color code: yellow). I can retrieve all the values of the main Calendar using the code below: Folder inbox = Folder.Bind(service, WellKnownFolderName.Calendar); view.PropertySet = new PropertySet(BasePropertySet.IdOnly); // This results in a FindItem call to EWS. FindItemsResults<Item> results = inbox.FindItems(view); i = 1; m = results.TotalCount; if (results.Count() > 0) { foreach (var item in results) { PropertySet props = new PropertySet(AppointmentSchema.MimeContent,AppointmentSchema.ParentFolderId,AppointmentSchema.Id,AppointmentSchema.Categories,AppointmentSchema.Location); // This results in a GetItem call to EWS. var email = Appointment.Bind(service, item.Id, props); string iCalFileName = @"C:\export\appointment" + i + ".ics"; // Save as .eml. using (FileStream fs = new FileStream(iCalFileName, FileMode.Create, FileAccess.Write)) { fs.Write(email.MimeContent.Content, 0, email.MimeContent.Content.Length); } i++; } Now I also want to get the schedules of all the remaining calendars, but I am not able to. Please help; I need it urgently.
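
    A sketch of one way to reach the other calendars (EWS Managed API; the page sizes and property set are illustrative): secondary calendars such as "Jorgen" and "Soren" are child folders of the default Calendar folder, so FindFolders can enumerate them and each one can then be queried just like the main calendar.

        // using Microsoft.Exchange.WebServices.Data;
        FindFoldersResults calendarFolders = service.FindFolders(
            WellKnownFolderName.Calendar,
            new FolderView(100) { Traversal = FolderTraversal.Deep });   // all calendars under the main one

        foreach (Folder folder in calendarFolders)
        {
            FindItemsResults<Item> items = folder.FindItems(new ItemView(100));
            foreach (Item item in items)
            {
                Appointment appt = Appointment.Bind(service, item.Id,
                    new PropertySet(AppointmentSchema.MimeContent, AppointmentSchema.Location));
                // export or process the appointment exactly as for the main calendar
            }
        }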

    Read the article

  • Sending a file over a web service from Java to .NET

    - by Goran
    Hello, I have built a .NET 1.1 web service which should accept files and save them. Here is the code of the web method: [WebMethod] public bool SaveDocument(Byte[] docbinaryarray, string docname) { string dirPath = @"C:\Temp\WSTEST\"; if(!Directory.Exists(dirPath)) { Directory.CreateDirectory(dirPath); } string filePath = dirPath + docname; FileStream objfilestream = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite); objfilestream.Write(docbinaryarray, 0, docbinaryarray.Length); objfilestream.Close(); return true; } When I build a client in .NET with a reference to this web service, everything works fine, but when a colleague of mine tries to send me a file from a Java client, I don't get the actual file. All I get is a byte array with only one element. The definition of the byte array for the file in the WSDL looks like this: <s:element minOccurs="0" maxOccurs="1" name="docbinaryarray" type="s:base64Binary" /> He sends me base64Binary and it fails every time. All I get is a byte array with only one element inside.
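
    If the mismatch turns out to be in how the Java stack serializes the byte[] (SOAP-encoded arrays versus a single base64Binary element), one pragmatic workaround is to exchange the payload as a plain Base64 string and decode it on the server, which both stacks handle identically. This is only a sketch; the method name is invented and the original SaveDocument can stay as it is.

        // using System; using System.IO; using System.Web.Services;
        [WebMethod]
        public bool SaveDocumentBase64(string docBase64, string docname)
        {
            // The Java client sends the file content as an ordinary Base64 string.
            byte[] docbinaryarray = Convert.FromBase64String(docBase64);

            string dirPath = @"C:\Temp\WSTEST\";
            if (!Directory.Exists(dirPath))
            {
                Directory.CreateDirectory(dirPath);
            }
            using (FileStream fs = new FileStream(Path.Combine(dirPath, docname),
                                                  FileMode.Create, FileAccess.Write))
            {
                fs.Write(docbinaryarray, 0, docbinaryarray.Length);
            }
            return true;
        }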

    Read the article

  • Cast errors with IXmlSerializable

    - by Nathan
    I am trying to use the IXmlSerializable interface to deserialize an Object (I am using it because I need specific control over what gets deserialized and what does not. See my previous question for more information). However, I'm stumped as to the error message I get. Below is the error message I get (I have changed some names for clarity): An unhandled exception of type 'System.InvalidCastException' occurred in App.exe Additional information: Unable to cast object of type 'System.Xml.XmlNode[]' to type 'MyObject'. MyObject has all the correct methods defined for the interface, and regardless of what I put in the method body for ReadXml() I still get this error. It doesn't matter if it has my implementation code or if it's just blank. I did some googling and found an error that looks similar to this involving polymorphic objects that implement IXmlSerializable. However, my class does not inherit from any others (besides Object). I suspected this may be an issue because I never reference XmlNode any other time in my code. Microsoft describes a solution to the polymorphism error: https://connect.microsoft.com/VisualStudio/feedback/details/422577/incorrect-deserialization-of-polymorphic-type-that-implements-ixmlserializable?wa=wsignin1.0#details The code the error occurs at is as follows. The object to be read back in is an ArrayList of "MyObjects" IO::FileStream ^fs = gcnew IO::FileStream(filename, IO::FileMode::Open); array<System::Type^>^ extraTypes = gcnew array<System::Type^>(1); extraTypes[0] = MyObject::typeid; XmlSerializer ^xmlser = gcnew XmlSerializer(ArrayList::typeid, extraTypes); System::Object ^obj; obj = xmlser->Deserialize(fs); fs->Close(); ArrayList ^al = safe_cast<ArrayList^>(obj); MyObject ^objs; for each(objs in al) //Error occurs here { //do some processing } Thanks for reading and for any help.

    Read the article

  • How to Capture a live stream from Windows Media Server 2008

    - by Hummad Hassan
    I want to capture the live stream from Windows Media Server to the filesystem on my PC. I have tried this against my own media server with the following code, but when I check the output file I find only a reference playlist in it rather than the actual stream. FileStream fs = null; try { HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://mywmsserver/test"); CookieContainer ci = new CookieContainer(1000); req.Timeout = 60000; req.Method = "Get"; req.KeepAlive = true; req.MaximumAutomaticRedirections = 99; req.UseDefaultCredentials = true; req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3"; req.ReadWriteTimeout = 90000000; req.CookieContainer = ci; //req.MediaType = "video/x-ms-asf"; req.AllowWriteStreamBuffering = true; HttpWebResponse resp = (HttpWebResponse)req.GetResponse(); Stream resps = resp.GetResponseStream(); fs = new FileStream("d:\\dump.wmv", FileMode.Create, FileAccess.ReadWrite); byte[] buffer = new byte[1024]; int bytesRead = 0; while ((bytesRead = resps.Read(buffer, 0, buffer.Length)) > 0) { fs.Write(buffer, 0, bytesRead); } } catch (Exception ex) { } finally { if (fs != null) fs.Close(); }

    Read the article

  • How to capture a live stream from Windows Media Server 2008 using C#.NET

    - by Hummad Hassan
    I want to capture the live stream from Windows Media Server to the filesystem on my PC. I have tried this against my own media server with the following code, but when I check the output file, this is what I find in it: [Reference] Ref1=http://mywindowsmediaserver/test?MSWMExt=.asf Ref2=http://mywindowsmediaserver/test?MSWMExt=.asf Please help me with this. Thanks. FileStream fs = null; try { HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://mywmsserver/test"); CookieContainer ci = new CookieContainer(1000); req.Timeout = 60000; req.Method = "Get"; req.KeepAlive = true; req.MaximumAutomaticRedirections = 99; req.UseDefaultCredentials = true; req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3"; req.ReadWriteTimeout = 90000000; req.CookieContainer = ci; //req.MediaType = "video/x-ms-asf"; req.AllowWriteStreamBuffering = true; HttpWebResponse resp = (HttpWebResponse)req.GetResponse(); Stream resps = resp.GetResponseStream(); fs = new FileStream("d:\\dump.wmv", FileMode.Create, FileAccess.ReadWrite); byte[] buffer = new byte[1024]; int bytesRead = 0; while ((bytesRead = resps.Read(buffer, 0, buffer.Length)) > 0) { fs.Write(buffer, 0, bytesRead); } } catch (Exception ex) { } finally { if (fs != null) fs.Close(); }

    Read the article

  • Is it OK to use WPF assemblies in a web app?

    - by Chris
    I have an ASP.NET MVC 2 app targeting .NET 4 that needs to be able to resize images on the fly and write them to the response. I have code that does this and it works. I am using System.Drawing.dll. However, I want to enhance my code so that not only am I resizing the image, but I am dropping it from 24bpp down to 4bit grayscale. I could not, for the life of me, find code on how to do this with System.Drawing.dll. But I did find a bunch of WPF stuff. This is my working/sample code (runs in LinqPad). // Load the original 24 bit image var bitmapImage = new BitmapImage(); bitmapImage.BeginInit(); bitmapImage.UriSource = new Uri(@"C:\Temp\Resized\18_appa2_015.png", UriKind.Absolute); //bitmapImage.DecodePixelWidth = 600; bitmapImage.EndInit(); // Create the destination image var formatConvertedBitmap = new FormatConvertedBitmap(); formatConvertedBitmap.BeginInit(); formatConvertedBitmap.Source = bitmapImage; formatConvertedBitmap.DestinationFormat = PixelFormats.Gray4; formatConvertedBitmap.EndInit(); // Encode and dump the image to disk var encoder = new PngBitmapEncoder(); encoder.Frames.Add(BitmapFrame.Create(formatConvertedBitmap)); using (var fileStream = File.Create(@"C:\Temp\Resized\18_appa2_015_s2.png")) { encoder.Save(fileStream); } It uses System.Xaml.dll, WindowsBase.dll, PresentationCore.dll, and PresentationFramework.dll. The namespaces used are: System.Windows.Controls, System.Windows.Media, and System.Windows.Media.Imaging. Is there any problem using these namespaces in my web application? It doesn't seem right. If anyone knows how to drop the bit depth without all this WPF stuff (which I barely understand, BTW) I would be thrilled to see that too.

    Read the article

  • Access denied when writing a PDF with iTextSharp on the server

    - by apekshabs
    I am facing an "access denied" error after uploading to the server. Document myDocument = new Document(PageSize.A5, 26, 72, 180, 180); string strUniqueFn = "onlineinvoice.pdf"; string imgpath = "logo.gif"; string strUser = Thread.CurrentPrincipal.Identity.Name.Substring(Thread.CurrentPrincipal.Identity.Name.IndexOf("\\") + 1).ToUpper(); string strFolder = Server.MapPath("."); System.IO.DirectoryInfo di = new System.IO.DirectoryInfo(strFolder); System.IO.FileInfo[] fi = di.GetFiles(strUser + "*.*"); for (i = 0; i <= fi.Length - 1; i++) { System.IO.File.Delete(strFolder + "\\" + strUniqueFn); } string strPath = strFolder + "\\" + strUniqueFn; PdfWriter pdfw = PdfWriter.GetInstance(myDocument, new FileStream(strPath, FileMode.Create)); string iPath = strFolder + "\\" + imgpath; pdfw.CloseStream = false; myDocument.Open(); ...................... myDocument.Close(); The error occurs at: PdfWriter pdfw = PdfWriter.GetInstance(myDocument, new FileStream(strPath, FileMode.Create)); Can anyone help me? Thank you.
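
    Two usual directions, sketched rather than verified for this site: either grant the application pool / ASP.NET identity write permission on the folder returned by Server.MapPath("."), or avoid writing to disk entirely by building the PDF in a MemoryStream and sending it in the response, so no NTFS permissions are involved. A sketch of the second option with iTextSharp:

        // using System.IO; using iTextSharp.text; using iTextSharp.text.pdf;
        Document myDocument = new Document(PageSize.A5, 26, 72, 180, 180);
        using (MemoryStream ms = new MemoryStream())
        {
            PdfWriter pdfw = PdfWriter.GetInstance(myDocument, ms);
            pdfw.CloseStream = false;              // keep the MemoryStream usable after Document.Close()
            myDocument.Open();
            // ... add the invoice content exactly as before ...
            myDocument.Close();

            Response.ContentType = "application/pdf";
            Response.AddHeader("Content-Disposition", "attachment; filename=onlineinvoice.pdf");
            Response.BinaryWrite(ms.ToArray());    // stream the PDF to the browser, nothing written to disk
            Response.End();
        }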

    Read the article

  • Reading non-English HTML pages with C#

    - by Gal Miller
    I am trying to find a Hebrew string in a website. The reading code is attached. Afterwards I try to read the file using a StreamReader, but I can't match strings in other languages. What am I supposed to do? // used on each read operation byte[] buf = new byte[8192]; // prepare the web page we will be asking for HttpWebRequest request = (HttpWebRequest) WebRequest.Create("http://www.webPage.co.il"); // execute the request HttpWebResponse response = (HttpWebResponse) request.GetResponse(); // we will read data via the response stream Stream resStream = response.GetResponseStream(); string tempString = null; int count = 0; FileStream fileDump = new FileStream(@"c:\dump.txt", FileMode.Create); do { count = resStream.Read(buf, 0, buf.Length); fileDump.Write(buf, 0, buf.Length); } while (count > 0); // any more data to read? fileDump.Close();
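
    Two things worth checking, sketched below: the dump writes buf.Length bytes on every pass instead of the count actually read, and the bytes have to be decoded with the page's real encoding (Hebrew sites are typically UTF-8 or windows-1255) before string matching will work. A sketch that lets StreamReader do the decoding, using the charset the server reports and falling back to UTF-8:

        // using System.IO; using System.Net; using System.Text;
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.webPage.co.il");
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (Stream resStream = response.GetResponseStream())
        {
            Encoding enc = string.IsNullOrEmpty(response.CharacterSet)
                ? Encoding.UTF8
                : Encoding.GetEncoding(response.CharacterSet);

            using (StreamReader reader = new StreamReader(resStream, enc))
            {
                string html = reader.ReadToEnd();
                bool found = html.Contains("טקסט לדוגמה");       // placeholder Hebrew text to search for
                File.WriteAllText(@"c:\dump.txt", html, Encoding.UTF8);
            }
        }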

    Read the article

  • Are finalizers ever allowed to call other managed classes' methods?

    - by romkyns
    I used to be pretty sure the answer is "no", as explained in Overriding the Finalize method and Object.Finalize documentation. However, while randomly browsing through FileStream in Reflector, I found that it can actually call just such a method from a finalizer: private SafeFileHandle _handle; ~FileStream() { if (this._handle != null) { this.Dispose(false); } } protected override void Dispose(bool disposing) { try { ... } finally { if ((this._handle != null) && !this._handle.IsClosed) // <=== HERE { this._handle.Dispose(); // <=== AND HERE } [...] } } I started wondering whether this will always work due to the exact way in which it's written, and hence whether the "do not touch managed classes from finalizers" is just a guideline that can be broken given a good reason and the necessary knowledge to do it right. I dug a bit deeper and found out that the worst that can happen when the "rule" is broken is that the managed object being accessed had already been finalized, or may be getting finalized in parallel on a separate thread. So if the SafeFileHandle's finalizer didn't do anything that would cause a subsequent call to Dispose fail then the above should be fine... right? Question: so there might after all be situations in which a method on another managed class may be called reliably from a finalizer? I've always believed this to be false, but this code suggests that it's possible and that there can be good enough reasons to do it. Bonus: Observe that the SafeFileHandle will not even know it's being called from a finalizer, since this is just a normal call to Dispose(). The base class, SafeHandle, actually has two private methods, InternalDispose and InternalFinalize, and in this case InternalDispose will be called. Isn't this a problem? Why not?...

    Read the article

  • Looping through a file in VB.NET

    - by Ousman
    I am writing a VB.NET program and I'm trying to accomplish the following: read and loop through a text file line by line; show the progress of the loop in a textbox or label until a button is pressed; the loop then stops on whatever number it has reached at that point; and when the button is pressed again, the loop continues. Code: Imports System.IO Public Class Form1 'Dim nFileNum As Integer = FreeFile() ' Get a free file number Dim strFileName As String = "C:\scb.txt" Dim objFilename As FileStream = New FileStream(strFileName, _ FileMode.Open, FileAccess.Read, FileShare.Read) Dim objFileRead As StreamReader = New StreamReader(objFilename) 'Dim lLineCount As Long 'Dim sNextLine As String Private Sub btStart_Click(ByVal sender As System.Object, _ ByVal e As System.EventArgs) _ Handles btStart.Click Try If objFileRead.ReadLine = Nothing Then MsgBox("No Accounts Available to show!", _ MsgBoxStyle.Information, _ MsgBoxStyle.DefaultButton2 = MsgBoxStyle.OkOnly) Return Else Do While (objFileRead.Peek() > -1) Loop lblAccounts.Text = objFileRead.ReadLine() 'objFileRead.Close() 'objFilename.Close() End If Catch ex As Exception MessageBox.Show(ex.Message) Finally 'objFileRead.Close() 'objFilename.Close() End Try End Sub Private Sub Form1_Load(ByVal sender As System.Object, _ ByVal e As System.EventArgs) _ Handles MyBase.Load End Sub End Class Problem: I'm able to read the text file, but my label only advances when I hit the start button. It goes to the next line, but I want it to continue looping through the entire file until I hit a button telling it to stop.

    Read the article

  • File locked by service (after the service reads the text file)

    - by rvpals
    I have a Windows service written in C# .NET. The service runs on an internal timer; every time the interval elapses, it goes and tries to read this log file into a string. My issue is that every time the log file is read, the service seems to lock the log file. The lock on that log file continues until I stop the Windows service. At the same time the service is checking the log file, the same log file needs to be continuously updated by another program. While the lock is held, the other program cannot update the log file. Here is the code I use to read the text log file: private string ReadtextFile(string filename) { string res = ""; try { System.IO.FileStream fs = new System.IO.FileStream(filename, System.IO.FileMode.Open, System.IO.FileAccess.Read); System.IO.StreamReader sr = new System.IO.StreamReader(fs); res = sr.ReadToEnd(); sr.Close(); fs.Close(); } catch (System.Exception ex) { HandleEx(ex); } return res; } Thank you.
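
    Two likely contributors, sketched as one possible rewrite: the file is opened without an explicit FileShare, so the writer is denied access while the read is in progress, and if ReadToEnd throws, the Close() calls are skipped and the handle stays open until the service stops. Opening with FileShare.ReadWrite and wrapping the streams in using blocks addresses both (HandleEx is the question's own helper):

        // using System.IO;
        private string ReadTextFile(string filename)
        {
            try
            {
                // Share read and write so the other program can keep updating the log,
                // and let 'using' release the handles even if an exception is thrown.
                using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
                using (var sr = new StreamReader(fs))
                {
                    return sr.ReadToEnd();
                }
            }
            catch (System.Exception ex)
            {
                HandleEx(ex);
                return "";
            }
        }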

    Read the article

  • Issue while saving an image with SaveFileDialog

    - by user1097772
    I'm using a SaveFileDialog to save an image. Canvas is a PictureBox and the loaded image is a Bitmap. When I try to save it, the file is created but somehow corrupted: when I later try to load the image again, or show it in a different viewer, it doesn't work, so the saved file is corrupted. Here is the method for saving the image: private void saveFileDialog1_FileOk(object sender, CancelEventArgs e) { System.IO.FileStream fs = (System.IO.FileStream)saveFileDialog1.OpenFile(); try { switch (saveFileDialog1.FilterIndex) { case 1: canvas.Image.Save(saveFileDialog1.FileName, System.Drawing.Imaging.ImageFormat.Bmp); break; case 2: canvas.Image.Save(saveFileDialog1.FileName, System.Drawing.Imaging.ImageFormat.Jpeg); break; case 3: canvas.Image.Save(saveFileDialog1.FileName, System.Drawing.Imaging.ImageFormat.Png); break; case 4: canvas.Image.Save(saveFileDialog1.FileName, System.Drawing.Imaging.ImageFormat.Tiff); break; } } catch (Exception ex) { System.Console.WriteLine("Exception " + ex); } I should also mention the Filter property; saveFileDialog1.Filter has the value: bmp (*.bmp)|*.bmp|jpeg (*.jpeg)|*.jpeg|png (*.png)|*.png|tiff (*.tiff)|*.tiff
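
    A plausible explanation: saveFileDialog1.OpenFile() opens and holds the target file, and Image.Save(saveFileDialog1.FileName, ...) then writes to that same path while the handle is still open, which can leave a truncated or empty file. One sketch of a fix is to save into the stream the dialog returns instead of opening the path a second time:

        // using System.ComponentModel; using System.Drawing.Imaging; using System.IO;
        private void saveFileDialog1_FileOk(object sender, CancelEventArgs e)
        {
            using (Stream fs = saveFileDialog1.OpenFile())   // the dialog's stream is the only writer
            {
                switch (saveFileDialog1.FilterIndex)
                {
                    case 1: canvas.Image.Save(fs, ImageFormat.Bmp); break;
                    case 2: canvas.Image.Save(fs, ImageFormat.Jpeg); break;
                    case 3: canvas.Image.Save(fs, ImageFormat.Png); break;
                    case 4: canvas.Image.Save(fs, ImageFormat.Tiff); break;
                }
            }
        }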

    Read the article

  • How to play multiple simultaneous audio sources in Silverlight

    - by Shurup
    I want to play multiple audio sources simultaneously in Silverlight. I've created a prototype in Silverlight 4 that should play two mp3 files containing the same tick sound at a one-second interval, so the files should sound like a single track if they are played together with any whole-second offset (0 and 1, 0 and 2, 1 and 1 seconds, etc.). In my prototype I use two MediaElement objects (me and me2). DateTime startTime; private void Play_Clicked(object sender, RoutedEventArgs e) { me.SetSource(new FileStream(file1, FileMode.Open)); me2.SetSource(new FileStream(file2, FileMode.Open)); var timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(1) }; timer.Tick += RefreshData; timer.Start(); } The first file should be played at 00:00 sec. and the second at 00:02 sec. void RefreshData(object sender, EventArgs e) { if(me.CurrentState != MediaElementState.Playing) { startTime = DateTime.Now; me.Play(); return; } var elapsed = DateTime.Now - startTime; if(me2.CurrentState != MediaElementState.Playing && elapsed >= TimeSpan.FromSeconds(2)) { me2.Play(); ((DispatcherTimer)sender).Stop(); } } The tracks play differently every time, and not simultaneously as they should (as one sound).

    Read the article

  • SQL Server slow in a production environment

    - by Lieven Cardoen
    I have a weird problem in a production environment at a customer. I can't give any details on the infrastructure, except that SQL Server runs on a virtual server and the data, log and filestream files are on other storage servers (data and filestream together, and the log on a separate server). Now, there's a query that, when we run it, gives these durations (after first clearing the cache): 300 ms, 20 ms, 15 ms, 17 ms, ... The first time takes longer, but from then on it is cached. At the customer, on a SQL Server that is more powerful, these are the durations (I didn't have the rights to clear the cache; I will try that tomorrow): 2500 ms, 2600 ms, 2400 ms, ... The query can be improved, that's true, but that's not the question here. How would you tackle this? I don't know where to go from here. The servers at this customer are really more powerful, but they do use virtual servers (we don't). What could be the cause? Not enough memory? Fragmentation? Physical storage? I know it can be a lot of things, but maybe some of you have some pointers on how to proceed with this issue...

    Read the article
