Search Results

Search found 47251 results on 1891 pages for 'web storage'.


  • Can someone correct my understanding of Windows 2008 DFS implementation?

    - by cwheeler33
    I have two Windows 2003 file servers using DoubleTake at the moment. They are in a 2003 AD domain, and each server has its own disk set. It is time to replace the servers... I want to use Windows 2008 64-bit Enterprise, and I was thinking of using DFS-R to replace DoubleTake. The part I'm not sure about is clustering. Do I need shared storage if each server has a copy of the data? I want the data to be available under the same share name, so maybe I don't fully understand how DFS is set up. I currently have about 6TB of data and expect to grow by 3TB a year on these file servers. Any resources/books that could teach me would be good to know as well.

    Read the article

  • [ASP.NET] File length missing when downloading under IIS7

    - by GTD
    I have the following code for file download:

        FileInfo fileInfo = new FileInfo(filePath);
        context.Response.Clear();
        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=" + System.IO.Path.GetFileName(filePath));
        context.Response.AddHeader("Content-Length", fileInfo.Length.ToString());
        context.Response.WriteFile(filePath);
        context.Response.End();

    When I run it on my local IIS6 it works fine. The web browser (tested on IE8, Firefox 3.5.2, Opera 10) shows the file length before I start downloading the file. When I run this code on a remote IIS7, the browser doesn't show the file length; the file length is unknown. Why don't I get the file length when this code runs under IIS7?
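
    One hedged variation worth testing is below: TransmitFile streams straight from disk without buffering the whole file. Note also that if IIS7's dynamic compression is applied to application/octet-stream responses, the Content-Length header can be dropped in favour of chunked transfer encoding, which would match this symptom.

        context.Response.Clear();
        context.Response.BufferOutput = false; // stream instead of buffering the whole file
        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=" + System.IO.Path.GetFileName(filePath));
        context.Response.AddHeader("Content-Length",
            new System.IO.FileInfo(filePath).Length.ToString());
        context.Response.TransmitFile(filePath); // writes directly to the IIS output stream
        context.Response.End();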

    Read the article

  • How to maximise the largest contiguous block of memory in the Large Object Heap

    - by Unsliced
    The situation is that I am making a WCF call to a remote server which returns an XML document as a string. Most of the time this return value is a few K, sometimes a few dozen K, very occasionally a few hundred K, but very rarely it could be several megabytes (the first problem is that there is no way for me to know). It's these rare occasions that are causing grief. I get a stack trace that starts:

        System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
           at System.Xml.BufferBuilder.AddBuffer()
           at System.Xml.BufferBuilder.AppendHelper(Char* pSource, Int32 count)
           at System.Xml.BufferBuilder.Append(Char[] value, Int32 start, Int32 count)
           at System.Xml.XmlTextReaderImpl.ParseText()
           at System.Xml.XmlTextReaderImpl.ParseElementContent()
           at System.Xml.XmlTextReaderImpl.Read()
           at System.Xml.XmlTextReader.Read()
           at System.Xml.XmlReader.ReadElementString()
           at Microsoft.Xml.Serialization.GeneratedAssembly.XmlSerializationReaderMDRQuery.Read2_getMarketDataResponse()
           at Microsoft.Xml.Serialization.GeneratedAssembly.ArrayOfObjectSerializer2.Deserialize(XmlSerializationReader reader)
           at System.Xml.Serialization.XmlSerializer.Deserialize(XmlReader xmlReader, String encodingStyle, XmlDeserializationEvents events)
           at System.Xml.Serialization.XmlSerializer.Deserialize(XmlReader xmlReader, String encodingStyle)
           at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
           at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)

    I've read around and it is because the Large Object Heap is just getting too fragmented, so even preceding the call with a quick check to StringBuilder.EnsureCapacity just causes the OutOfMemoryException to be thrown earlier (and because I'm guessing at what's needed, it might not actually need that much, so my check is causing more problems than it is solving). Some opinions are that there's not much I can do about it. Some of the questions I've asked myself:

    Use less memory, i.e. have you checked for leaks? Yes. The memory usage goes up and down, but there's no fundamental growth that guarantees this will happen. Some of the times it fails, it succeeded at that stage previously.

    Transfer smaller amounts? Not an option; this is a third-party web service over which I have no control (or at least it would take a long time to resolve, and in the meantime I still have a problem).

    Can you do something to the LOH to make it less likely to fail? ... now this is the most fruitful course. It's a 32-bit process (it has to be for various political, technical and boring reasons) but there are normally hundreds of meg free (multiples of the largest amount for which we've seen failures).

    Can we monitor the LOH? Using perfmon I can track the size of the heaps, but I don't think there's a way to monitor the largest available contiguous block of memory.

    Question is: any advice or suggestions for things to try?
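
    On the monitoring point above, one rough idea (a sketch, not a proven diagnostic): since perfmon exposes no largest-free-block counter, you can approximate it by binary-searching the largest single array the process can currently allocate.

        using System;

        static class LohProbe
        {
            // Rough probe of the largest currently-allocatable contiguous block.
            // The probe array itself perturbs the heap, so treat the result as
            // indicative only.
            public static int LargestAllocatableBytes()
            {
                int lo = 0, hi = int.MaxValue;
                while (lo < hi)
                {
                    int mid = (int)(((long)lo + hi + 1) / 2);
                    try
                    {
                        byte[] probe = new byte[mid]; // lands on the LOH once mid > ~85,000 bytes
                        GC.KeepAlive(probe);
                        lo = mid;     // allocation succeeded; try bigger
                    }
                    catch (OutOfMemoryException)
                    {
                        hi = mid - 1; // allocation failed; try smaller
                    }
                }
                return lo;
            }
        }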

    Read the article

  • Creating BlackBerry method stubs using wscompile on WSDL from ColdFusion

    - by Jim B
    I have been working on a BlackBerry application that consumes web services from ColdFusion 7. The Java ME SDK and the Java Wireless Toolkit both require that the generated WSDL be of the document/literal type. Fortunately, I have input on the web service development, so I tried setting 'style="document"' in the cfcomponent tag. This generated a document/literal style WSDL, but now wscompile generates the following errors in several places:

        Found unknown simple type: javax.xml.soap.SOAPElement
        Found unknown simple type: java.util.Calendar

    Any ideas why this is happening? The WSDL does get parsed correctly by the JWSDP tool, but the stubs use namespaces that are not available on the J2ME platform. I would have thought ColdFusion WSDL would work more easily with other products in the Java family.

    Read the article

  • Tutorials/Books on using Mono to develop RESTful webservices?

    - by max
    Hi, anyone out there got any pointers to good links/tutorials/books on developing web services with Mono? In more detail, I am interested in:

    - using Mono from project start on a Linux host, developing in C#
    - using Visual Studio for development, ideally with remote debugging if that is realistic
    - developing web services in Mono accessible in a RESTful manner, returning JSON
    - hiding the service processes behind an Apache
    - accessing the services either via javascript/AJAX or from a thin script layer written in PHP
    - scalability (important for me)
    - unit-testing of web services

    Any recommendations for material I could sift through to get a good head-start? I might add that I'm C#/.NET savvy, but not in the context of web development. I've been using it since it came out, but mainly for internal server-client applications where the clients were Windows desktop apps and the communication layer was remoting or, sometimes, more low-level socket-based. Thanks, max
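
    As a concrete starting point, here is a minimal sketch of a self-hosted RESTful JSON endpoint running on Mono's HttpListener. The /products/ route and the JSON payload are invented for illustration; in practice it would sit behind Apache (e.g. via mod_proxy) and a real framework would replace the hand-rolled loop.

        using System.Net;
        using System.Text;

        class RestSketch
        {
            static void Main()
            {
                var listener = new HttpListener();
                listener.Prefixes.Add("http://localhost:8080/products/"); // hypothetical route
                listener.Start();
                while (true)
                {
                    HttpListenerContext ctx = listener.GetContext();
                    // Serve a canned JSON document; a real service would dispatch on
                    // ctx.Request.HttpMethod and ctx.Request.Url here.
                    byte[] body = Encoding.UTF8.GetBytes("[{\"id\":1,\"name\":\"example\"}]");
                    ctx.Response.ContentType = "application/json";
                    ctx.Response.ContentLength64 = body.Length;
                    ctx.Response.OutputStream.Write(body, 0, body.Length);
                    ctx.Response.OutputStream.Close();
                }
            }
        }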

    Read the article

  • Most useful techniques for two-way data binding with JS

    - by Perpetualcoder
    With the abundance of web services and the client-side templating features of jQuery and the like, creating mashups, or sites consuming a multitude of web services and posting data back to those services, is becoming exceedingly popular. For a page of decent size with this kind of architecture, say a dashboard: what are the useful techniques for maintaining this client-side state? In other words, what are some of the ways to do two-way data binding? Sample scenario:

    - Get data from service as JSON/XML
    - Display/bind on UI
    - Capture user input
    - Recreate request from the UI controls/HTML
    - Post data to service
    - Get response and rebind

    Read the article

  • MySQL keeps crashing the server... please help adjust my.ini!

    - by TruMan1
    I have MySQL 5.0 installed on a Windows 2008 machine (3GB RAM). My server crashes on a regular basis (almost once a day) with this error:

        Changed limits: max_open_files: 2048 max_connections: 800 table_cache: 619

    I did not use the heavy InnoDB .ini file, although I am now wondering whether I should have. I am worried that big configuration changes will make my current sites stop working. What should I do? Here are my current ini settings:

        default-character-set=latin1
        default-storage-engine=INNODB
        max_connections=800
        query_cache_size=84M
        table_cache=1520
        tmp_table_size=30M
        thread_cache_size=38
        myisam_max_sort_file_size=100G
        myisam_sort_buffer_size=30M
        key_buffer_size=129M
        read_buffer_size=64K
        read_rnd_buffer_size=256K
        sort_buffer_size=256K
        innodb_additional_mem_pool_size=6M
        innodb_flush_log_at_trx_commit=1
        innodb_log_buffer_size=3M
        innodb_buffer_pool_size=250M
        innodb_log_file_size=50M
        innodb_thread_concurrency=10
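
    For context, a back-of-envelope worst-case memory estimate from the settings above, using the common rule of thumb of global buffers plus per-connection buffers times max_connections (a sketch; actual MySQL usage varies):

        class MemoryEstimate
        {
            static void Main()
            {
                // Global buffers from the .ini above, in MB.
                double globalMB = 129   // key_buffer_size
                                + 84    // query_cache_size
                                + 250   // innodb_buffer_pool_size
                                + 6     // innodb_additional_mem_pool_size
                                + 3;    // innodb_log_buffer_size
                // Per-connection buffers, in KB (tmp_table_size only applies when a
                // connection builds an in-memory temporary table).
                double perConnKB = 64         // read_buffer_size
                                 + 256        // read_rnd_buffer_size
                                 + 256        // sort_buffer_size
                                 + 30 * 1024; // tmp_table_size
                double worstCaseMB = globalMB + 800 * perConnKB / 1024.0;
                System.Console.WriteLine(worstCaseMB); // ~24,900 MB, far beyond 3GB of RAM
            }
        }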

    Read the article

  • How to upload a file via the JSON interface in Rails

    - by akafazov
    Hi, I have a web service in Rails which, among other things, should provide file upload functionality to the clients. The clients all use JSON to talk to the web service. I use the Paperclip plugin for upload management. The problem is I do not know how to upload a file via JSON. It all works in the web form, but I cannot find information on how to construct my JSON request to send files to the server. Can somebody help out? Regards, Angel Kafazov
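
    One commonly suggested pattern is sketched below: Base64-encode the file bytes into the JSON payload, and have the Rails controller decode them into an IO object before handing them to Paperclip. The URL and field names here are invented and must match whatever the controller expects.

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class UploadSketch
        {
            static void Main()
            {
                // Base64-encode the file so it survives JSON transport (hypothetical field names).
                string encoded = Convert.ToBase64String(File.ReadAllBytes("photo.jpg"));
                string json = "{\"photo\":{\"filename\":\"photo.jpg\",\"data\":\"" + encoded + "\"}}";

                var request = (HttpWebRequest)WebRequest.Create("http://example.com/photos.json");
                request.Method = "POST";
                request.ContentType = "application/json";
                byte[] body = Encoding.UTF8.GetBytes(json);
                request.ContentLength = body.Length;
                using (Stream s = request.GetRequestStream())
                    s.Write(body, 0, body.Length);
                request.GetResponse().Close(); // the Rails side must Base64-decode before Paperclip sees it
            }
        }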

    Read the article

  • What if I have an API method and a controller/view method with the same name in RoR?

    - by Chad Johnson
    Suppose I want to be able to view a list of products on my site by going to /product/list. Great. So this uses my 'list' view and outputs some HTML which my web browser will render. But now suppose I want to provide a REST API to my client where they can get a list of their products. So I suppose I'd have them authenticate with oAuth and then they'd call /product/list which would return a JSON array of their products. But like I said earlier, /product/list displays an HTML web page. So, I have a conflict. What is normal practice as far as providing APIs in Rails? Should I have a subdirectory, 'api', in /app/controller, and another 'product' controller? So my client would go to /api/product/list to get a list of their products? I'm a bit new to RoR, so I don't have the best grasp of the REST functionality yet, but hopefully my question makes sense.

    Read the article

  • Calculating IOPS for a single HDD - what am I doing wrong?

    - by red888
    So I know there is no standardized way of calculating IOPS for a HDD, but from everything I have read it appears one of the most accurate formulas is the following:

        {disk service time} = {seek time} + {rotational latency} + ({block size} / {data transfer rate})

    This is the time per IO in milliseconds, or what the book I've been reading calls "Disk Service Time". Rotational latency is calculated as half of one rotation, in milliseconds. This was taken from the EMC book "Information Storage and Management", arguably a pretty reliable source, right or wrong?

    Putting this formula into practice, consider this Seagate data sheet. I am going to calculate IOPS for the ST3000DM001 model for a block size of 4kb:

        Seek Average (Write) = 9.5ms (I'll be measuring IOPS for writes)
        Spindle speed = 7200rpm
        Average Data Rate = 156MB/s

    So my variables are:

        Seek Time = 9.5ms
        Rotational latency = (0.5 / (7200rpm / 60)) = 0.004s = 4ms
        Data Rate = 156MB/s = (0.156MB/ms / 0.004MB) = 39

        9.5ms + 4ms + 39 = 52.5 per IO
        1 / (52.5 * 0.001) = 19 IOPS

    19 IOPS for this drive clearly is not right, so what am I doing wrong?
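
    For comparison, a hedged recalculation: the suspect step looks like the transfer term, since dividing the data rate by the block size yields a unitless 39, whereas dividing block size by rate gives a time in milliseconds, consistent with the other two terms.

        class IopsEstimate
        {
            static void Main()
            {
                double seekMs = 9.5;                                  // average write seek
                double rotLatencyMs = 0.5 / (7200.0 / 60.0) * 1000.0; // half a rotation, ~4.17 ms
                double transferMs = (4.0 / 1024.0) / 156.0 * 1000.0;  // 4 KB at 156 MB/s, ~0.026 ms
                double serviceTimeMs = seekMs + rotLatencyMs + transferMs; // ~13.7 ms per IO
                double iops = 1000.0 / serviceTimeMs;                 // ~73 IOPS, plausible for a 7200rpm drive
                System.Console.WriteLine(iops);
            }
        }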

    Read the article

  • Linear performance scalability with HP SAN solutions

    - by Berzemus
    Hi all, I need a SAN solution with linear scalability in size as well as in performance. From what I know, with a modular smart array solution such as the P2000/MSA-class solutions from HP, even with a dual-controller initial node, I can only increase its size, since added nodes come controller-less, so overall performance tends to decrease. On the other hand, each node in the P4000 (LeftHand) family of solutions has its own controller, so when a node is added, storage capacity as well as performance increases. Am I right in all that I say, and is the P4000 the only such solution, or have I forgotten something?

    Read the article

  • Apache Axis2: tried-and-tested books

    - by josh
    I am looking to get my feet wet in Java web services. Apache CXF and Axis2 seem to be in demand a lot. I googled a bit to find a good Axis2 book that has lots of working examples. Some books like Quick Start Apache Axis2 seem to have bad reviews on Amazon, and titles like Web Services with Apache CXF and Axis2 seem to lack working examples. I want to get the opinion of people who currently work with Axis2 and have read a book or two on the subject. Which is a good book for an Axis2 n00b?

    Read the article

  • SQL 2008 R2: Data\Log partitions

    - by Reese Hirth
    I have a SQL Server setup that a previous IT person configured with a 2TB data partition and a 1TB log partition. The OS partition is 244GB, and SQL is installed on a separate 1TB partition. We have an additional 8TB of storage that I would like the new IT staff to bring online. He wants to create 4 new 2TB data partitions. I see this as confusing. Can't we just back up the current data partition, blow it away, and create a new 10TB data partition? I'm responsible for administering the data on the server but am not allowed to do the setup myself. This is a GIS server running ArcGIS Server with around 60 geodatabases ranging from 20GB to a couple that may grow to over a TB. So: five 2TB data partitions, or one 10TB partition? Thanks for the advice.

    Read the article

  • Best way to replicate servers

    - by Matthew
    I currently have two servers, both with Linux software RAID1 configurations. They use Heartbeat and DRBD to create a shared DRBD device that hosts an exported NFS directory. The servers run Ubuntu Server with an LXDE GUI and some IP. These servers are going to be placed on fishing vessels to act as redundant storage for IP cameras. My boss wants me to figure out the most efficient way to create these servers. We might be looking at pushing out several systems a week. Each configuration will be almost identical besides IP addressing. What would be the best method to automate the configuration process? We are trying to cut down on labor costs to set these up. Imaging and preseeding are both on my mind right now.

    Read the article

  • Huge amount of time sending data with suds and proxy

    - by Roman
    Hi everyone, I have the following code to send data through a proxy using suds:

        import urllib2
        import suds

        t = suds.transport.http.HttpTransport()
        proxy = urllib2.ProxyHandler({'http': 'http://192.168.3.217:3128'})
        opener = urllib2.build_opener(proxy)
        t.urlopener = opener
        ws = suds.client.Client('http://xxxxxxx/web.asmx?WSDL', transport=t)
        req = ws.factory.create('ActionRequest.request')
        req.SerialNumber = 'asdf'
        req.HostName = 'hola'
        res = ws.service.ActionRequest(req)

    I don't know why, but it can spend 2 or 3 minutes sending data, or even more, and it sometimes raises a "Gateway timeout" exception. If I don't use the proxy, it takes about 2 seconds or less. Here is the SOAP reply:

        (ActionResponse){
           Id = None
           Action = "Action.None"
           Objects = ""
        }

    The proxy works fine with other requests through urllib2, or with normal web browsers like Firefox. Does anyone have any idea what's happening here with suds? Thanks a lot in advance!

    Read the article

  • Problem in the SharePoint object model when accessing SharePoint list items

    - by JanardhanReddy
    I wrote the following:

        using (SPSite site = SPContext.Current.Site)
        {
            using (SPWeb web = site.OpenWeb())
            {
                //SPList lst = web.Lists["ManagerInfo"];
                SPList lst = web.Lists[strlist];
                SPQuery getUserNameQuery = new SPQuery();
                // getUserNameQuery.Query = "<Where><And><Eq><FieldRef Name=\"Region\" /><Value Type=\"Text\">" + strRegion + "</Value></Eq><And><Eq><FieldRef Name=\"PM_x0020_First_x0020_Name\" /><Value Type=\"Text\">" + pmFName + "</Value></Eq><Eq><FieldRef Name=\"PM_x0020_Last_x0020_Name\" /><Value Type=\"Text\">" + pmLname + "</Value></Eq></And></And></Where>";
                // getUserNameQuery.Query = "<Where><And><Eq><FieldRef Name=\"PM_x0020_First_x0020_Name\" /><Value Type=\"Text\">" + pmFName + "</Value></Eq><Eq><FieldRef Name=\"PM_x0020_Last_x0020_Name\" /><Value Type=\"Text\">" + pmLname + "</Value></Eq></And></Where>";
                getUserNameQuery.Query = "<Where><Eq><FieldRef Name=\"PM_x0020_Name\" /><Value Type=\"Text\">" + loginName + "</Value></Eq></Where>";
                SPListItemCollection items = lst.GetItems(getUserNameQuery);
                foreach (SPListItem item in items)
                {
                    managerFName = item["Manager Name"].ToString();
                    strAccounting = item["Accounting"].ToString();
                    managerFName = managerFName.Replace(".", " ");
                    strAccounting = strAccounting.Replace(".", " ");
                    // isFound = true;
                    XPathNavigator managerName = MainDataSource.CreateNavigator().SelectSingleNode("/my:myFields/my:txtManagerName", NamespaceManager);
                    managerName.SetValue(managerFName);
                    XPathNavigator accountingName = MainDataSource.CreateNavigator().SelectSingleNode("/my:myFields/my:txtAccountingName", NamespaceManager);
                    accountingName.SetValue(strAccounting);
                }
            }
        }

    I used this code in InfoPath, and the form is used by all users. When the current login user has no permissions on the list, it shows an error; when the current login user has full permissions, it works. So please advise: what can I do to make it work for all users?
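
    A sketch of the usual pattern for this kind of permission problem follows. Note also that wrapping SPContext.Current.Site in a using block, as above, disposes an object SharePoint still needs; elevated code should open fresh SPSite/SPWeb objects instead.

        // Sketch: run the list query with elevated privileges so users without read
        // access to the list can still have their form fields populated. New
        // SPSite/SPWeb objects must be created inside the delegate; reusing
        // SPContext.Current.Site would keep the original (unprivileged) user's token.
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite elevatedSite = new SPSite(SPContext.Current.Site.ID))
            using (SPWeb elevatedWeb = elevatedSite.OpenWeb())
            {
                SPList lst = elevatedWeb.Lists[strlist];
                // ... run the same SPQuery and field updates here ...
            }
        });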

    Read the article

  • Unreadable files uploaded via PHP FTP functions

    - by Mike
    I just set up a LAMP development server and am still troubleshooting some things. The server is installed on one computer, and I use a Windows laptop to write my code and test the site via the web browser. My file upload script works, in that JPEG image files are successfully uploaded to the server, but when I try to view the images in the web browser, permission is denied. I checked the permissions on the files via the server and they are 600. I can fix the issue with chmod 777 theimage.jpg, but this doesn't seem like a good solution at all. Does the solution have something to do with the Apache configuration, or is there something else I should be doing? Thank-you, Mike

    Read the article

  • FreeNAS - how to "Exclude from file" in Rsyncd (GUI)

    - by user179181
    I am trying to set up rsync tasks to pull user profiles from 11 Windows machines running DeltaCopy Server, and then configure ZFS periodic snapshot tasks for a backup solution. So far this has been working fine, although I would like to exclude certain file types like .DAT or NTUSER.DAT. My exclusion file resides on the local ZFS dataset (the receiving side) and is as follows:

        Temp
        Temporary Internet Files
        NTUSER.DAT
        NTUSER.DAT.LOG
        *.dat
        *.tmp
        *.DAT.log
        *.ost
        *.pst

    The command I typed under Auxiliary Parameters (Rsyncd Global Conf under Services) is as follows:

        exclude from = /mnt/Storage/User_Profiles/exclude.txt

    I've tried deleting the .DAT files from the receiving end, and just as I start to get excited, I click refresh and there they are again.

    Read the article

  • 'client_errors' warning in output from cron job that runs PHP on GoDaddy

    - by Kevin
    Hi. I have several cron jobs that I've set up on my GoDaddy hosting account through their control panel. The commands look like this:

        /web/cgi-bin/php5 "$HOME/html/myfolder/cron_do_stuff.php"

    The jobs run as scheduled, the scripts work perfectly, and the output from the scripts always gets sent to my email address. I would love a way to disable this (since the PHP script can send its own emails if necessary). But my real question is about the output, which always contains this on the first line:

        /web/cgi-bin/php5: Symbol `client_errors' has different size in shared object, consider re-linking

    Looks like a server configuration error to me, and after talking with GoDaddy they said it's a benign warning and not to worry about it. I was just curious if anyone had ideas for fixing it.
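
    On the side wish of disabling the email, a conventional crontab idiom (this silences all output, including the warning line, rather than fixing it; cron only mails when a job produces output):

        /web/cgi-bin/php5 "$HOME/html/myfolder/cron_do_stuff.php" > /dev/null 2>&1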

    Read the article

  • ELMAH: Only sending specific exception type via mail

    - by Sir Code-A-Lot
    Hi, I have ELMAH set up for a web app, logging exceptions to a SQL Server. I wish to have ELMAH send me an email too, but only when a specific exception is thrown (i.e. MySpecialException). ELMAH must still log all exceptions to SQL Server. I know you can do it programmatically in global.asax, but I'd prefer to use web.config. So, how do I restrict ELMAH error mails to filter out everything but a specific exception type, using web.config?
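
    For reference, the programmatic route mentioned above looks roughly like this. ErrorMail_Filtering is ELMAH's naming convention for hooking only the mail module from Global.asax, leaving the SQL log untouched; whether the same per-module scoping is expressible purely in web.config may depend on the ELMAH version.

        // Sketch (Global.asax code-behind): dismiss mail for everything except
        // MySpecialException, while the SQL error log keeps recording all exceptions.
        void ErrorMail_Filtering(object sender, Elmah.ExceptionFilterEventArgs e)
        {
            if (!(e.Exception.GetBaseException() is MySpecialException))
                e.Dismiss();
        }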

    Read the article

  • Linux periodically "losing" ability to connect to server via SSH?

    - by gct
    I know this isn't exactly a programming question, but it popped up in my use of git for programming projects, at least. I've got a web server that I use to host my git repos, but my Ubuntu box seems to "lose" the ability to connect to it via SSH. I'll get a "connection refused" error when I try to ssh or use git. Rebooting my local machine fixes the problem, but only temporarily. I can still connect to the web interface just fine, and the problem manifests with other servers as well. I've been working around it by pulling my changes over to my laptop and pushing from there, but that's sub-optimal, as you can imagine. Has anyone seen something like this? I'd be tempted to say it's some kind of IP caching problem, but I can't connect even using the IP address of the server directly... Running Ubuntu 9.04.

    Read the article

  • Secure messaging using Secure MIME: is it reliable?

    - by aaronb
    We have an automatic reporting and notification system written in .NET that sends plain-text emails. We now have to encrypt the messages that we send our clients. The possible implementation approaches we have:

    - Send messages as S/MIME email with attachments.
    - Send a plain-text email that just contains a link to a web site that will display the message over HTTPS.

    S/MIME seems like the simpler solution, as we won't need to create the web application or secure it. Our concern is interoperability with our clients' email clients and, more importantly, their email filtering software. Has anyone had success or issues deploying a secure MIME messaging solution?
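
    On the S/MIME option: .NET's System.Net.Mail has no built-in S/MIME support, but the CMS types in System.Security.Cryptography.Pkcs can produce an enveloped (encrypted) body by hand. A rough sketch follows, with an assumed recipient certificate file and invented addresses; a production version would add signing, careful MIME assembly, and error handling.

        using System.IO;
        using System.Net.Mail;
        using System.Net.Mime;
        using System.Security.Cryptography.Pkcs;  // requires a reference to System.Security
        using System.Security.Cryptography.X509Certificates;
        using System.Text;

        class SmimeSketch
        {
            static void Main()
            {
                // Inner MIME entity to encrypt; a real message would be built more carefully.
                byte[] content = Encoding.ASCII.GetBytes(
                    "Content-Type: text/plain\r\n\r\nYour report is attached.");

                // recipient.cer is an assumed file holding the client's public certificate.
                var recipientCert = new X509Certificate2("recipient.cer");
                var enveloped = new EnvelopedCms(new ContentInfo(content));
                enveloped.Encrypt(new CmsRecipient(recipientCert));
                byte[] encrypted = enveloped.Encode();

                var message = new MailMessage("reports@example.com", "client@example.com");
                message.Subject = "Encrypted report";
                var view = new AlternateView(new MemoryStream(encrypted), new ContentType(
                    "application/pkcs7-mime; smime-type=enveloped-data; name=smime.p7m"));
                view.TransferEncoding = TransferEncoding.Base64;
                message.AlternateViews.Add(view);
                new SmtpClient("localhost").Send(message);
            }
        }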

    Read the article

  • Zero code coverage with cobertura 1.9.2 but tests are working

    - by eraonel
    I run the code coverage target:

        <junit fork="yes" dir="${basedir}" failureProperty="test.failed">
            <!-- Note the classpath order: instrumented classes are before the
                 original (uninstrumented) classes. This is important. -->
            <classpath path="${instrumented.dir}" />
            <classpath path="${classes.dir}" />
            <classpath refid="classpath" />
            <!-- The instrumented classes reference classes used by the Cobertura
                 runtime, so Cobertura and its dependencies must be on your classpath. -->
            <classpath refid="cobertura.classpath" />
            <formatter type="xml" />
            <!--<test name="${testcase}" todir="${reports.xml.dir}" if="testcase" />-->
            <batchtest fork="yes" todir="${reports.xml.dir}">
                <fileset dir="${classes.dir}">
                    <include name="**/generated/AllTests.class" />
                </fileset>
            </batchtest>
        </junit>
        <junitreport todir="${reports.xml.dir}">
            <fileset dir="${reports.xml.dir}">
                <include name="TEST-*.xml" />
            </fileset>
            <report format="frames" todir="${reports.html.dir}" />
        </junitreport>

    Then I get the following output (when using fork="true"):

        java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:585)
            at net.sourceforge.cobertura.util.FileLocker.lock(FileLocker.java:124)
            at net.sourceforge.cobertura.coveragedata.ProjectData.saveGlobalProjectData(ProjectData.java:331)
            at net.sourceforge.cobertura.coveragedata.SaveTimer.run(SaveTimer.java:31)
            at java.lang.Thread.run(Thread.java:595)
        Caused by: java.io.IOException: No locks available
            at sun.nio.ch.FileChannelImpl.lock0(Native Method)
            at sun.nio.ch.FileChannelImpl.lock(FileChannelImpl.java:784)
            at java.nio.channels.FileChannel.lock(FileChannel.java:865)
            ... 8 more
        ---------------------------------------
        Unable to get lock on /vobs/rnc/rrt/roam2/roamSs/RoamMao_swb/RoamMao_bldu/ant_build/cobertura.ser.lock: null
        This is known to happen on Linux kernel 2.6.20.
        Make sure cobertura.jar is in the root classpath of the jvm process running the
        instrumented code. If the instrumented code is running in a web server, this means
        cobertura.jar should be in the web server's lib directory. Don't put multiple copies
        of cobertura.jar in different WEB-INF/lib directories. Only one classloader should
        load cobertura. It should be the root classloader.

    I am using Ant 1.7.0 and Cobertura 1.9.2. Any ideas why there is no coverage? Tests run OK, as I see in my target output. I have tried switching Java versions (1.5.0_06 and 1.6.0_10) but it makes no difference.
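
    One hedged thing to try: the root failure is "No locks available" on cobertura.ser.lock under /vobs, a ClearCase-style mount where file locking often fails. Pointing Cobertura's data file at a local, lockable filesystem via its standard system property inside the forked junit task may help (a sketch; the instrumentation step should be given the same datafile location):

        <junit fork="yes" dir="${basedir}" failureProperty="test.failed">
            <!-- keep the coverage data file off the /vobs mount so FileLocker can lock it -->
            <sysproperty key="net.sourceforge.cobertura.datafile"
                         file="${java.io.tmpdir}/cobertura.ser" />
            <!-- ... classpaths, formatter and batchtest as above ... -->
        </junit>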

    Read the article
