Search Results

Search found 85833 results on 3434 pages for 'general log file'.

  • Auto Log-Off Windows users - Windows 2003 domain

    - by thehatter
    I am trying to make Windows clients automatically log off after some time. I have been trying to use winexit.scr, which I have seen working elsewhere in a similar environment. After working through these instructions (I did read the comments and noticed the original ADM provided is buggy) I've had no joy whatsoever! Winexit.scr refuses to read any settings in the registry, even though, using a test account, I can access the required reg key(s) and edit, add, and remove values. Essentially winexit.scr always uses its default values: a 30-second timeout with no forced log-off. What I really want is a 30-minute timeout with a forced log-off, closing all the user's apps etc. I've tried removing and re-adding the ADM template, creating the GPO from scratch several times, and giving various registry permissions - including full control to "Everybody" just for fun! Oh, clients are all Win XP SP3, the DC is Win 2003 R2 SP2. So, can anybody suggest something? Cheers!
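
    Not a fix for winexit.scr itself, but a commonly suggested fallback is to skip the screensaver entirely and let Task Scheduler force the log-off on idle. A minimal sketch, assuming logoff.exe is present on the XP clients (the task name and idle time are placeholders):

        schtasks /create /tn "AutoLogoff" /tr "logoff" /sc onidle /i 30

    Pushed out via a computer startup script, this would log the interactive user off after 30 idle minutes without depending on the buggy ADM template.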

  • Tomcat log include servlet context

    - by Kris
    I have a Tomcat instance running several websites. Recently I've been trying to deal with the various error messages that wind up in the Tomcat log file (catalina.out). None of the issues are affecting the websites, but all the noise is making it difficult to see actual problems. My problem is that frequently the message is being emitted by a library that is used by multiple webapps. Unless a stacktrace is included (which it often isn't) I can't tell which webapp is responsible without a lot of digging. So the question is, can I somehow configure Tomcat to include the servlet context in the log file? Or perhaps have different log files per context?
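
    Tomcat's JULI logging can be configured per web application, which gives a separate log file per context. A minimal sketch of a WEB-INF/classes/logging.properties placed inside one webapp (the prefix is a placeholder):

        handlers = org.apache.juli.FileHandler, java.util.logging.ConsoleHandler
        org.apache.juli.FileHandler.level = INFO
        org.apache.juli.FileHandler.directory = ${catalina.base}/logs
        org.apache.juli.FileHandler.prefix = mywebapp.

    Messages logged through java.util.logging by libraries loaded from that webapp's classloader then land in logs/mywebapp.YYYY-MM-DD.log instead of catalina.out; log4j-based libraries would need an equivalent per-app log4j.properties.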

  • Troubleshoot odd large transaction log backups...

    - by Tim
    I have a SQL Server 2005 SP2 system with a single database that is 42 GB in size. It is a modestly active database that sees on average 25 transactions per second. The database is configured in the Full recovery model and we perform transaction log backups every hour. However, it seems pretty random: at some point during the day, the log backup will jump from its average size of 15 MB all the way up to 40 GB. There are only 4 jobs scheduled to run on the SQL Server, and they are all typical backup jobs that run on a daily/weekly basis. I'm not entirely sure what client activity takes place, as the application servers are maintained by a different department. Is there any good way to track down the cause of these log file growths and pinpoint them to a particular application or client? Thanks in advance.
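
    One way to narrow this down is to snapshot who holds open transactions around the time the log balloons. A sketch, assuming it runs (e.g. from an Agent job that logs the results to a table) shortly before each hourly log backup:

        DBCC SQLPERF(LOGSPACE);  -- how full the log actually is right now
        SELECT s.session_id, s.login_name, s.host_name, s.program_name,
               a.name AS tran_name, a.transaction_begin_time
        FROM sys.dm_tran_session_transactions t
        JOIN sys.dm_exec_sessions s ON s.session_id = t.session_id
        JOIN sys.dm_tran_active_transactions a ON a.transaction_id = t.transaction_id;

    Collected over a day or two, this should reveal which host/application had a long-running or bulk transaction whenever a 40 GB backup shows up.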

  • Log Files from bash script output

    - by neildeadman
    I have a script that runs fine. I'd like to produce logfiles from its output while still showing it on screen. I have this command (taken from a blog) that creates three files:

        ((./fk.sh 2>&1 1>&3 | tee errors.log) 3>&1 1>&2 | tee output.log) 2>&1 | tee final.log

    This does exactly what I want. My only issue is that I create files in my script and copy them somewhere, and I'd like to copy these logfiles there too, which I can't do while the script is running. I also wanted to make it easier for any user to run my script, so I created another script to run this one. According to this post (see the last post), if I put a . before the script name, variables assigned in the called script become usable in the calling script. It doesn't seem to work, though, and I can't figure out why or find alternative methods. Can anyone help?
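
    For the sourcing part, the pattern that should carry variables across looks like this sketch (fk_wrapper.sh and $DEST are hypothetical names):

        #!/bin/bash
        # run the logged pipeline in the current shell so its variables survive
        . ./fk_wrapper.sh            # equivalent to: source ./fk_wrapper.sh
        # once the pipeline has finished, the tee files are complete and closed
        cp errors.log output.log final.log "$DEST"/

    Note that sourcing only helps for variables set at the top level of the script: anything assigned inside the (...) groups of the tee pipeline runs in a subshell and is lost either way.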

  • SQL Server 2008 R2 Replication log reader could not execute sp_replcmds

    - by user49352
    This log reader agent worked perfectly for several months, until the user referenced in the error was removed from the domain. After that, the error 'The process could not execute 'sp_replcmds' on 'SERVER'' was received, with the more detailed 'Could not obtain information about Windows NT group/user' referencing said user. This user was referenced nowhere in the log reader agent other than the Publication Access List, from which it was subsequently removed. The agent would still not start up successfully. The simple problem here, I believe, is that the log reader agent was created under a user that no longer exists in the domain. Is there an 'owner' somewhere that needs to be changed? Every other function on the database continues to execute successfully. Any other help or thoughts would be appreciated.
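
    For reference: the log reader executes sp_replcmds in the security context of the published database's owner, so if that database is owned by the now-deleted domain account, the call fails with exactly this pair of errors. A minimal sketch of the usual fix (the database name is a placeholder):

        USE PublishedDb;
        EXEC sp_changedbowner 'sa';

    After reassigning the owner to a valid login, restart the log reader agent.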

  • RoboCopy Log Files missing whitespace?

    - by TwystNeko
    So I'm working on a script that uses RoboCopy to copy a bunch of files and log what it's copied. It works reasonably well, except for the logfiles. They tend to look like this:

        C:\Users\Tech\Documents\desktop.initechscan1.jpgtechscan2.jpgtechscan3.jpgtechscan4.jpgtechscan5.jpgwsus.jpgwsus2.jpgC:\Users\Tech\Documents\My Music\C:\Users\Tech\Documents\My Pictures\C:\Users\Tech\Documents\My Videos\C:\Users\Tech\Documents\My Digital Editions\

    As you can see, the log seems to be missing all whitespace and separators between entries. Is there something I can do to fix this? It's kind of frustrating. The command line I'm using is this:

        C:\Users\Tech\Desktop>Robocopy.exe C:\Users\Tech\Documents c:\Temp /e /l /b /xj /xf ntuser.* desktop.ini *.lnk /np /njh /log:migratedfiles.txt /v

    I have the /l in there since I'm debugging, and it's the easiest way to keep from copying everything a million times.
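
    If this turns out to be an encoding or line-ending problem with the log file rather than the copy itself, newer RoboCopy builds can write a Unicode log and still echo to the console. A hedged variant of the same command, assuming your RoboCopy version supports /unilog and /tee:

        Robocopy.exe C:\Users\Tech\Documents c:\Temp /e /l /b /xj /xf ntuser.* desktop.ini *.lnk /np /njh /unilog:migratedfiles.txt /tee /v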

  • Du Meter Log file

    - by Jack
    Where can I find the DU Meter log file? I tried searching C:\ProgramData\Hagel Technologies\DU Meter, but the folder is empty. I also tried C:\Users\Username\AppData\Roaming, Local, and LocalLow, but none of them even have a DU Meter or Hagel Technologies folder. I even tried searching the temp folder, but still nothing. I have a NetMeter.csv log file that I want to try to drop in over the DU Meter log file, because I can't seem to find any other way to import data into DU Meter.

  • USER_LOGIN audit log with incorrect auid value?

    - by hijinx
    We have a CentOS 6.2 x86_64 system that's logging what looks to be erroneous audit information. We were receiving alerts for failed logins by a user who wasn't actually trying to log in. After some diagnosis, we figured out that the source of the events is our tool that periodically checks whether SSH is answering. When that happens, we see this log entry:

        type=USER_LOGIN msg=audit(1340312224.011:489216): user pid=28787 uid=0 auid=501 ses=8395 subj=unconfined_u:system_r:sshd_t:s0-s0:c0.c1023 msg='op=login acct=28756E6B6E6F776E207A01234567 exe="/usr/sbin/sshd" hostname=? addr=127.0.0.1 terminal=ssh res=failed'

    This is the entry we get whenever there is an incomplete SSH connection, but usually the auid is the same as the ses= value. For some reason, on this system, it's using a particular user's auid, regardless of the login user. For example, ssh'ing to this system as [email protected] and cancelling before providing a password generates this error. Attempting to log in to an unrelated account with a bogus password will also create an entry with the incorrect auid value.
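
    While investigating, ausearch can pull all of these records out of the audit log filtered by type and result; a small sketch:

        ausearch -m USER_LOGIN -sv no -i

    The -i flag interprets numeric fields, so the suspect auid is shown as a username, which makes it easier to spot which account the kernel thinks owns these sessions.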

  • Web log files analyzer

    - by Peter Štibraný
    I already use Google Analytics on my page, but I'd like to get additional info from log files. I've looked at various packages over the last few days, but nothing has impressed me so far. Some requirements:

    - must work at the log file level (I use Apache combined logs, but can configure Apache to produce other types of logs)
    - can generate static reports (Windows/Linux) or use a GUI (Windows only)
    - should be easy to add custom user agents and rerun the analysis
    - if it can recognize installation of Eclipse plugins from the log, that would be a big plus
    - understands the Google SERP position referer
    - should not require two days to set up (awstats, I am looking at you)
    - should still be under active development (i.e. analog isn't a good answer)
    - preferably free, or at least not very expensive :-)

    Any good analyzer programs out there?

  • Log with iptables which user is delivering email to port 25

    - by Maus
    Because we got blacklisted on CBL, I set up the following firewall rules with iptables:

        #!/bin/bash
        iptables -A OUTPUT -d 127.0.0.1 -p tcp -m tcp --dport 25 -j ACCEPT
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -m owner --gid-owner mail -j ACCEPT
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -m owner --uid-owner root -j ACCEPT
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -m owner --uid-owner Debian-exim -j ACCEPT
        iptables -A OUTPUT -p tcp -m limit --limit 15/minute -m tcp --dport 25 -j LOG --log-prefix "LOCAL_DROPPED_SPAM"
        iptables -A OUTPUT -p tcp -m tcp --dport 25 -j REJECT --reject-with icmp-port-unreachable

    I'm not able to connect to port 25 from localhost with any user other than root or a mail group member, so it seems to work. Still, some questions remain:

    - How effective do you rate this rule set at preventing spam coming from bad PHP scripts hosted on the server?
    - Is there a way to block ports 25 and 587 within the same statement?
    - Is the usage of /usr/sbin/sendmail also limited or blocked by this rule set?
    - Is there a way to log the username of all other attempts to deliver stuff to port 25?
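
    On the last two points, a sketch of what should work (both are standard iptables features): the multiport match covers 25 and 587 in one rule, and the LOG target's --log-uid option records the UID of the process that generated the packet:

        iptables -A OUTPUT -p tcp -m multiport --dports 25,587 -m limit --limit 15/minute -j LOG --log-prefix "LOCAL_DROPPED_SPAM " --log-uid

    /usr/sbin/sendmail itself should not be blocked by these rules: they only apply when something opens an outbound TCP connection to port 25, which the local MTA's delivery process does as the Debian-exim user you already allow.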

  • Is there such a thing as a file hosted container which deduplicates data held within?

    - by Mallow
    Background: I have backups of a website which stores all of its data in a single file. This file is several gigs large, and I have many different backups of it. Most of the data within is the same, plus whatever was added or changed. I want to keep all the backups I've made through the years, in case I find a horrible surprise of data corruption along the line. However, storing a 10-gig file every month gets expensive.

    Seeking a solution: I've often thought about different ways of alleviating this problem. One thought that comes up very often combines the idea of a deduplicating file system that doesn't require its own partitioned volume on a hard drive, something like TrueCrypt's "file hosted containers", which the TrueCrypt program lets you mount and dismount as a regular hard drive.

    Question: Is there a virtual hard drive mounter which uses a file-based container with a deduplicating file system? (This question is a little awkward to put into words; if you have a better idea of how to ask it, please feel free to help out.)
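
    Not a mountable container, but deduplicating archivers get very close to this workflow. A sketch using borg, assuming it is available on the backup host (repository path and archive name are placeholders):

        borg init --encryption=none /backups/site.borg
        borg create /backups/site.borg::monthly-{now} /data/site.db

    Each monthly archive then costs roughly only the changed blocks, and any old archive can be extracted (or FUSE-mounted with borg mount) when you need to inspect a past backup.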

  • Restricting server.log file size (minecraft) in CentOS

    - by MisdartedPenguin
    I'm currently running a Bukkit (Minecraft) server which generates a server.log file with all the console messages/errors. Every now and then a plugin I need crashes and can cause the server.log file size to increase dramatically. I've had it hit 32 GB before, which used all my disk space. Is there a way to make it a rolling log (deleting old errors), or to limit the file size so it can't go above, say, 10 MB? The solution must not affect how the server runs, so that it doesn't throw an error when it can't write anymore. Any way of doing this in CentOS?
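
    logrotate handles exactly this, and copytruncate is the directive that keeps the running server happy: the log is copied and truncated in place, so Bukkit keeps writing to the same file handle and never sees an error. A sketch of /etc/logrotate.d/minecraft (the path is a placeholder):

        /home/minecraft/server.log {
            size 10M
            rotate 3
            copytruncate
            compress
            missingok
        }

    One caveat: stock CentOS runs logrotate from cron.daily, so the log can still exceed 10 MB between runs; calling logrotate from an hourly cron entry tightens that window.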

  • Turn "log slow queries" ON.

    - by CodedK
    Hello, I'm trying to log MySQL slow queries, but I can't turn the log on. I will explain all my steps:

    1. Open and edit my.cnf and add the following lines (I have MySQL 5.0.7):

        long_query_time = 5
        slow_query_log_file = /myfolder/slowq.log
        log_slow_queries = 1

    2. Give the mysql user permission to write to the file: chown -R mysql:mysql /var/lib/mysql
    3. Create the file: touch /myfolder/slowq.log
    4. Chmod the file to 777.
    5. service mysqld restart

    From the MySQL admin panel I can see that the log_slow_queries variable is OFF! Also, no logs are created. Thanks in advance! Best regards, Panos.
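
    The variable names are the likely culprit here: slow_query_log_file and a separate on/off switch only exist from MySQL 5.1 onward, while 5.0 expects the option itself to carry the file path. A sketch of the 5.0-style my.cnf (make sure it sits in the [mysqld] section, then restart):

        [mysqld]
        long_query_time = 5
        log-slow-queries = /myfolder/slowq.log

    If the admin panel still shows log_slow_queries = OFF after a restart, check that the my.cnf being edited is the one mysqld actually reads (mysqld --help --verbose prints the search order).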

  • Here's your chance: MOS Feedback Sessions @OOW

    - by cwarticki
    Bring your questions, comments, concerns, opinions, recommendations, enhancement requests, and any emotional outbursts! As I travel the world and speak to thousands of customers, I receive plenty of feedback about My Oracle Support. Come hear directly from the source and meet Dennis Reno, VP of Customer Portal Experience. The Customer Portal Experience team will host a My Oracle Support Tips and Techniques session and three roundtable feedback sessions at this year's Oracle OpenWorld. The sessions will include a Hardware Support component, as well as best practices that are sure to benefit all My Oracle Support users. The events will give our users the opportunity to learn more about how the My Oracle Support customer portal adds value to the support process and to their business needs. The roundtable feedback sessions will allow customers to meet, give feedback, and share their experiences directly with the team responsible for the customer portal experience.

    Mon, Oct 1, 01:45 PM PT - My Oracle Support: Tips and Techniques for Getting the Best Hardware Support Possible (Session #CON9745)
    Tue, Oct 2, 11:00 AM PT - Roundtable - My Oracle Support General Feedback
    Wed, Oct 3, 11:00 AM PT - Roundtable - My Oracle Support Community Feedback
    Thu, Oct 4, 11:00 AM PT - Roundtable - My Oracle Support General Feedback

    Customers can find more information, including specific details about how to attend, by accessing My Oracle Support at OpenWorld (Article ID 1484508.1). Enjoy OpenWorld everyone! -Chris Warticki, Global Customer Management

  • Adding complexity by generalising: how far should you go?

    - by marcog
    Reference question: http://stackoverflow.com/questions/4303813/help-with-interview-question

    The above question asked to solve a problem for an NxN matrix. While there was an easy solution, I gave a more general solution to solve the more general problem for an NxM matrix. A handful of people commented that this generalisation was bad because it made the solution more complex. One such comment is voted +8. Putting aside the hard-to-explain voting effects on SO, there are two types of complexity to be considered here:

    - Runtime complexity, i.e. how fast the code runs
    - Code complexity, i.e. how difficult the code is to read and understand

    The question of runtime complexity requires a better understanding of the input data today and what it might look like in the future, taking the various growth factors into account where necessary. The question of code complexity is the one I'm interested in here. By generalising the solution, we avoid having to rewrite it in the event that the constraints change. However, at the same time it can often result in complicating the code. In the reference question, the code for NxN is easy to understand for any competent programmer, but the NxM case (unless documented well) could easily confuse someone coming across the code for the first time. So, my question is this: where should you draw the line between generalising and keeping the code easy to understand?


  • How to concatenate all commit messages from subversion into one text file with no metadata?

    - by user144182
    I would like to take all the commit messages in my Subversion log and just concatenate them into one text file. Each commit message has this format:

        - r1 message
        - r1 message
        - r1 message

    What I would like is something like:

        - r1 message
        - r1 message
        - r2 message
        - r2 message
        - r3 message
        [...]
        - r1000 message

    Update: I thought the above was clear, but what I don't want in the log is this type of info:

        r2130 | user| 2010-03-19 10:36:13 -0400 (Fri, 19 Mar 2010) | 1 line

    No metadata; I simply want the commit messages.
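
    A sketch of one way to do this with standard tools: svn log prints each entry as a dashed separator line, an "rNNN | author | date" header, and then the message, so filtering out the first two line types leaves just the messages:

        svn log | awk '/^-+$/ {next} /^r[0-9]+ \|/ {next} {print}' > messages.txt

    (svn log --xml piped through an XML tool is the more robust option if any commit message could itself look like a separator line.)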

  • How to upload a file from client to server using OFBiz?

    - by SIVAKUMAR.J
    Hi all, I'm new to OFBiz, so forgive any mistakes in my question; I don't yet know some of the OFBiz terminologies, and my question may not be fully clear because of my lack of knowledge. Please try to understand the question and give me a solution at my level, with good examples, because very high-level solutions are hard for me to follow.

    My problem: I created a project inside the ofbiz/hot-deploy folder, namely "productionmgntSystem". Inside the folder "ofbiz\hot-deploy\productionmgntSystem\webapp\productionmgntSystem" I created a .ftl file, namely "app_details_1.ftl", with the following code:

        <html>
        <head>
        <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
        <title>Insert title here</title>
        <script TYPE="TEXT/JAVASCRIPT" language="JAVASCRIPT">
        function uploadFile() {
            //alert("Before calling upload.jsp");
            window.location='<@ofbizUrl>testing_service1</@ofbizUrl>'
        }
        </script>
        </head>
        <!-- <form action="<@ofbizUrl>testing_service1</@ofbizUrl>" enctype="multipart/form-data" name="app_details_frm"> -->
        <form action="<@ofbizUrl>logout1</@ofbizUrl>" enctype="multipart/form-data" name="app_details_frm">
        <center style="height: 299px; ">
        <table border="0" style="height: 177px; width: 788px">
        <tr style="height: 115px; ">
            <td style="width: 103px; ">
            <td style="width: 413px; "><h1>APPLICATION DETAILS</h1>
            <td style="width: 55px; ">
        </tr>
        <tr>
            <td style="width: 125px; ">Application name : </td>
            <td><input name="app_name_txt" id="txt_1" value=" " /></td>
        </tr>
        <tr>
            <td style="width: 125px; ">Excell sheet : </td>
            <td><input type="file" name="filename"/></td>
        </tr>
        <tr>
            <td>
                <!-- <input type="button" name="logout1_cmd" value="Logout" onclick="logout1()"/> -->
                <input type="submit" name="logout_cmd" value="logout"/>
            </td>
            <td>
                <!-- <input type="submit" name="upload_cmd" value="Submit" /> -->
                <input type="button" name="upload1_cmd" value="Upload" onclick="uploadFile()"/>
            </td>
        </tr>
        </table>
        </center>
        </form>
        </html>

    The file "ofbiz\hot-deploy\productionmgntSystem\webapp\productionmgntSystem\WEB-INF\controller.xml" contains:

        ...
        <request-map uri="testing_service1">
            <security https="true" auth="true"/>
            <event type="java" path="org.ofbiz.productionmgntSystem.web_app_req.WebServices1" invoke="testingService"/>
            <response name="ok" type="view" value="ok_view"/>
            <response name="exception" type="view" value="exception_view"/>
        </request-map>
        ...
        <view-map name="ok_view" type="ftl" page="ok_view.ftl"/>
        <view-map name="exception_view" type="ftl" page="exception_view.ftl"/>
        ...

    And this is the file "ofbiz\hot-deploy\productionmgntSystem\src\org\ofbiz\productionmgntSystem\web_app_req\WebServices1.java" (debug printlns condensed):

        package org.ofbiz.productionmgntSystem.web_app_req;

        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import java.io.DataInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;

        public class WebServices1 {

            public static String testingService(HttpServletRequest request, HttpServletResponse response) {
                String result = "ok";
                System.out.println("testingService - Start");
                String contentType = request.getContentType();
                System.out.println("testingService - contentType : " + contentType);
                if ((contentType != null) && (contentType.indexOf("multipart/form-data") >= 0)) {
                    try {
                        DataInputStream in = new DataInputStream(request.getInputStream());
                        int formDataLength = request.getContentLength();
                        byte dataBytes[] = new byte[formDataLength];
                        int byteRead = 0;
                        int totalBytesRead = 0;
                        // this loop reads the uploaded body into the byte array
                        while (totalBytesRead < formDataLength) {
                            byteRead = in.read(dataBytes, totalBytesRead, formDataLength);
                            totalBytesRead += byteRead;
                        }
                        String file = new String(dataBytes);
                        // extract the file name
                        String saveFile = file.substring(file.indexOf("filename=\"") + 10);
                        saveFile = saveFile.substring(0, saveFile.indexOf("\n"));
                        saveFile = saveFile.substring(saveFile.lastIndexOf("\\") + 1, saveFile.indexOf("\""));
                        int lastIndex = contentType.lastIndexOf("=");
                        String boundary = contentType.substring(lastIndex + 1, contentType.length());
                        // find the index of the file content
                        int pos;
                        pos = file.indexOf("filename=\"");
                        pos = file.indexOf("\n", pos) + 1;
                        pos = file.indexOf("\n", pos) + 1;
                        pos = file.indexOf("\n", pos) + 1;
                        int boundaryLocation = file.indexOf(boundary, pos) - 4;
                        int startPos = ((file.substring(0, pos)).getBytes()).length;
                        int endPos = ((file.substring(0, boundaryLocation)).getBytes()).length;
                        // create a new file with the same name and write the content into it
                        FileOutputStream fileOut = new FileOutputStream("/" + saveFile);
                        fileOut.write(dataBytes, startPos, (endPos - startPos));
                        fileOut.flush();
                        fileOut.close();
                    } catch (IOException ioe) {
                        System.out.println("testingService - catch IOException");
                        return "exception";
                    } catch (Exception ex) {
                        System.out.println("testingService - catch Exception");
                        return "exception";
                    }
                } else {
                    System.out.println("testingService - else part");
                    result = "exception";
                }
                System.out.println("testingService - End");
                return result;
            }
        }

    I want to upload a file to the server. The file comes from the user via the <input type="file"> tag in "app_details_1.ftl" and should be written to the server by the method testingService(...) in the class WebServices1, but the file is never uploaded. Please give me a good solution for uploading a file to the server. Thanks & Regards, Sivakumar.J

  • Java compiler error: "cannot find symbol" when trying to access local variable

    - by HH
    $ javac GetAllDirs.java
    GetAllDirs.java:16: cannot find symbol
    symbol  : variable checkFile
    location: class GetAllDirs
            System.out.println(checkFile.getName());
                               ^
    1 error
    $ cat GetAllDirs.java
    import java.util.*;
    import java.io.*;

    public class GetAllDirs {
        public void getAllDirs(File file) {
            if (file.isDirectory()) {
                System.out.println(file.getName());
                File checkFile = new File(file.getCanonicalPath());
            } else if (file.isFile()) {
                System.out.println(file.getName());
                File checkFile = new File(file.getParent());
            } else {
                // checkFile should get initialized at least HERE!
                File checkFile = file;
            }
            System.out.println(file.getName());
            // WHY ERROR HERE: checkFile not found
            System.out.println(checkFile.getName());
        }

        public static void main(String[] args) {
            GetAllDirs dirs = new GetAllDirs();
            File current = new File(".");
            dirs.getAllDirs(current);
        }
    }
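
    The compiler is right: each checkFile is declared inside an if/else branch, so it goes out of scope at that branch's closing brace and no longer exists on line 16. Declaring it once at method scope fixes this; note that getCanonicalPath() also throws IOException, which must be handled or declared. A minimal corrected sketch:

        public void getAllDirs(File file) throws IOException {
            File checkFile;                      // one declaration visible to the whole method
            if (file.isDirectory()) {
                checkFile = new File(file.getCanonicalPath());
            } else if (file.isFile()) {
                checkFile = new File(file.getParent());
            } else {
                checkFile = file;                // every branch assigns it, so it is definitely assigned
            }
            System.out.println(checkFile.getName());
        }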

  • Visual Studio 2008 project file does not load because of an unexpected encoding change.

    - by Xenan
    In our team we have a database project in Visual Studio 2008 which is under source control in Team Foundation Server. Every two weeks or so, after one co-worker checks in, the project file won't load on the other developers' machines. The error message is:

        The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.

    When I look at the project file in Notepad++, it starts like this (you can see <?xml version in it):

        ??<NUL?NULxNULmNULlNUL NULvNULeNULrNULsNULiNULoNULnNUL ...

    whereas a normal project file starts like:

        <?xml version="1.0" encoding="utf-16"?>

    So probably something is wrong with the encoding of the file. This is a problem for us because it turns out to be impossible to get the file encoding correct again; the 'solution' is to throw away the project file and get the last known working version from source control. According to the file, the encoding should be UTF-16; according to Notepad++, the corrupted file is actually UTF-8. My questions are:

    1. Why is Visual Studio messing up the encoding of the project file, apparently at random times and on random machines?
    2. What should we do to prevent this?
    3. When it has happened, is there a way to restore the current file in the correct encoding instead of pulling an older version from source control?

    As a last note: the problem is with one single project file; all the other project files don't exhibit this problem.
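
    For question 3, if the bytes on disk are intact UTF-16 text that merely lost its byte-order mark, a small C# sketch like this can rewrite the file with a proper BOM (back the file up first; this assumes a specific form of corruption and is not a guaranteed repair):

        using System.IO;
        using System.Text;

        class FixEncoding
        {
            static void Main(string[] args)
            {
                string path = args[0];
                // read assuming UTF-16 LE without a BOM, then rewrite with a BOM
                string text = File.ReadAllText(path, Encoding.Unicode);
                File.WriteAllText(path, text, Encoding.Unicode);
            }
        }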

  • File.Move does not inherit permissions from target directory?

    - by Joseph Kingry
    In case something goes wrong in creating a file, I've been writing to a temporary file and then moving it to the destination. Something like:

        var destination = @"C:\foo\bar.txt";
        var tempFile = Path.GetTempFileName();
        using (var stream = File.OpenWrite(tempFile))
        {
            // write to file here
        }

        string backupFile = null;
        try
        {
            var dir = Path.GetDirectoryName(destination);
            if (!Directory.Exists(dir))
            {
                Directory.CreateDirectory(dir);
                Util.SetPermissions(dir);
            }
            if (File.Exists(destination))
            {
                // Guid.NewGuid(), not new Guid(), which would always be all zeros
                backupFile = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());
                File.Move(destination, backupFile);
            }
            File.Move(tempFile, destination);
            if (backupFile != null)
            {
                File.Delete(backupFile);
            }
        }
        catch (IOException)
        {
            if (backupFile != null && !File.Exists(destination) && File.Exists(backupFile))
            {
                File.Move(backupFile, destination);
            }
        }

    The problem is that the new "bar.txt" in this case does not inherit permissions from the "C:\foo" directory. Yet if I create a file via Explorer/Notepad etc. directly in "C:\foo" there are no issues, so I believe the permissions are correctly set on "C:\foo". Update: I found "Inherited permissions are not automatically updated when you move folders"; it seems to apply to files as well. Now I'm looking for a way to force an update of file permissions. Is there a better way overall of doing this?
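
    File.Move keeps the ACL the file had in the temp directory instead of inheriting from C:\foo. A commonly suggested follow-up (a sketch, untested here) is to re-enable inheritance on the moved file so it picks up the target folder's permissions:

        // after File.Move(tempFile, destination):
        var acl = File.GetAccessControl(destination);
        acl.SetAccessRuleProtection(false, false);   // false = inherit rules from the parent again
        File.SetAccessControl(destination, acl);

    The alternative is File.Copy followed by File.Delete on the temp file, since a newly created copy inherits from its parent naturally.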

  • Can't access my files in ASP.NET web site

    - by jumbojs
    I'm having a very difficult time. I am running Windows 2008 Server, and I have an AbleCommerce site using ASP.NET with C#. I'm writing an automated task that FTPs some XML files down into a local directory on our web server; the program then parses each XML file and saves information to our database. The problem: once I save the files to our local directory, my program has no access to them. The NETWORK SERVICE user's permissions aren't being inherited by the XML files, so my program can't do anything with them. I can manually change the permissions, but that wouldn't be automated and won't work. How can I get this to work? Help please, it's very frustrating.
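
    Assuming the worker process really runs as NETWORK SERVICE, granting it inheritable rights on the drop directory once can be scripted; a sketch with a placeholder path:

        icacls "C:\inetpub\feeds" /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)M

    (OI)(CI) makes the grant inheritable by new files and subfolders, and M gives modify rights, which covers reading, parsing, and deleting the downloaded XML.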

  • XML File as Excel file.

    - by FrustratedWithFormsDesigner
    I have a number of reports that I run against my database that need to go to the end users as Excel spreadsheets. Initially I was creating text reports, but the steps to convert the text to a spreadsheet were a bit cumbersome: there were too many steps to import text into the spreadsheet, and multi-line text rows were imported as individual rows in Excel (which was incorrect). Currently I am generating simple XML and saving the file with an ".xls" extension. This works better, but there is still the problem of Excel prompting the user with an XML import dialogue every time they open the file, and then having to save a new file if they add notes or change the layout (which they almost certainly will be doing). Sample "xls" file:

        <?xml version="1.0" standalone="yes"?>
        <report_rows>
          <row>
            <NAME>Test Data</NAME>
            <COUNT>345</COUNT>
          </row>
          <!-- many more row elements... -->
        </report_rows>

    Is there any way to add markup to the file to hint to Excel how it should import and handle the file? Ideally, the end user should be able to open and save the file like any other spreadsheet they create directly from Excel. Is this even possible? Update: we are running Office 2003 here. Update: the XML is generated from a sqlplus script; there is no option to use C#/.NET here.
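
    Since Office 2003 natively understands SpreadsheetML, one option is to emit the report in Excel's own XML schema with the mso-application processing instruction; Excel then opens the file with no import dialogue. A minimal sketch for the sample row above (still plain text, so sqlplus can generate it):

        <?xml version="1.0"?>
        <?mso-application progid="Excel.Sheet"?>
        <Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"
                  xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">
          <Worksheet ss:Name="Report">
            <Table>
              <Row>
                <Cell><Data ss:Type="String">Test Data</Data></Cell>
                <Cell><Data ss:Type="Number">345</Data></Cell>
              </Row>
            </Table>
          </Worksheet>
        </Workbook>

    Users will still be prompted to save as .xls when they add notes (it's an XML file underneath), but opening is seamless.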

  • What would happen if the same file were read and appended to at the same time? (Python programming)

    - by Shane
    I'm writing a script using two separate threads, one doing file-reading operations and the other doing appending; both threads run fairly frequently. My question is: if one thread happens to read the file while the other is in the middle of appending a string such as "This is a test", what would happen? I know that if you are appending a smaller-than-buffer string, no matter how frequently you read the file in the other thread, you would never see an incomplete line such as "This i" in what you read; the OS would either append "This is a test" and then read from the file, or read from the file and then append "This is a test", and the sequence append "This i", read from the file, append "s a test" would never happen. But if "This is a test" is big enough (assuming it's a bigger-than-buffer string), the OS can't do the append in one operation, so the appending job is divided in two: first append "This i", then append "s a test". In that situation, if I happen to read the file in the middle of the whole appending operation, would I get the sequence append "This i", read from the file, append "s a test", that is, might I read a file that includes an incomplete string?
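
    If the goal is simply to never observe a half-written line, the portable answer is to serialise access yourself rather than rely on OS buffering guarantees. A minimal sketch in Python:

        import threading

        file_lock = threading.Lock()   # shared by the reader and appender threads

        def append_line(path, text):
            with file_lock:            # only one thread touches the file at a time
                with open(path, "a") as f:
                    f.write(text + "\n")

        def read_all(path):
            with file_lock:
                with open(path) as f:
                    return f.read()

    Since both threads live in one process, a threading.Lock is enough; separate processes would need file locking (e.g. fcntl.flock) instead.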

  • Tracking down rogue disk usage

    - by Amadan
    I found several other questions regarding the theory behind my problem (e.g. this, this), but I don't know how to apply the answers to my machine.

        # du -hsx /
        11000283    /
        # df -kT /
        Filesystem               Type 1K-blocks      Used Available Use% Mounted on
        /dev/mapper/csisv13-root ext4 516032952 361387456 128432532  74% /

    There is a big difference between 11G (du) and 345G (df). Where are the remaining 334G? It's not in deleted files. There was only one, it was short, and I truncated it just in case. This is what remains:

        # lsof -a +L1 /
        COMMAND   PID   USER FD TYPE DEVICE SIZE/OFF NLINK     NODE NAME
        zabbix_ag 4902 zabbix  1w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4902 zabbix  2w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4906 zabbix  1w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4906 zabbix  2w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4907 zabbix  1w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4907 zabbix  2w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4908 zabbix  1w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4908 zabbix  2w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4909 zabbix  1w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4909 zabbix  2w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4910 zabbix  1w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)
        zabbix_ag 4910 zabbix  2w  REG  252,0       0     0 28836028 /var/log/zabbix-agent/zabbix_agentd.log.1 (deleted)

    I rebooted to see if fsck would do anything. But from /var/log/boot.log it seems there are no issues:

        /dev/mapper/server-root: clean, 3936097/32768000 files, 125368568/131064832 blocks

    Thinking maybe someone had overzealously reserved root space, I checked the master record:

        # tune2fs -l /dev/mapper/server-root
        tune2fs 1.42 (29-Nov-2011)
        Filesystem volume name:   <none>
        Last mounted on:          /
        Filesystem UUID:          86430ade-cea7-46ce-979c-41769a41ecbe
        Filesystem magic number:  0xEF53
        Filesystem revision #:    1 (dynamic)
        Filesystem features:      has_journal ext_attr resize_inode dir_index filetype needs_recovery extent flex_bg sparse_super large_file huge_file uninit_bg dir_nlink extra_isize
        Filesystem flags:         signed_directory_hash
        Default mount options:    user_xattr acl
        Filesystem state:         clean
        Errors behavior:          Continue
        Filesystem OS type:       Linux
        Inode count:              32768000
        Block count:              131064832
        Reserved block count:     6553241
        Free blocks:              5696264
        Free inodes:              28831903
        First block:              0
        Block size:               4096
        Fragment size:            4096
        Reserved GDT blocks:      992
        Blocks per group:         32768
        Fragments per group:      32768
        Inodes per group:         8192
        Inode blocks per group:   512
        Flex block group size:    16
        Filesystem created:       Fri Feb  1 13:44:04 2013
        Last mount time:          Tue Aug 19 16:56:13 2014
        Last write time:          Fri Feb  1 13:51:28 2013
        Mount count:              9
        Maximum mount count:      -1
        Last checked:             Fri Feb  1 13:44:04 2013
        Check interval:           0 (<none>)
        Lifetime writes:          1215 GB
        Reserved blocks uid:      0 (user root)
        Reserved blocks gid:      0 (group root)
        First inode:              11
        Inode size:               256
        Required extra isize:     28
        Desired extra isize:      28
        Journal inode:            8
        First orphan inode:       28836028
        Default directory hash:   half_md4
        Directory Hash Seed:      bca55ff5-f530-48d1-8347-25c004f66d43
        Journal backup:           inode blocks

    The system is:

        # uname -a
        Linux server 3.2.0-67-generic #101-Ubuntu SMP Tue Jul 15 17:46:11 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
        # cat /etc/lsb-release
        DISTRIB_ID=Ubuntu
        DISTRIB_RELEASE=12.04
        DISTRIB_CODENAME=precise
        DISTRIB_DESCRIPTION="Ubuntu 12.04.2 LTS"

    Does anyone have any tips on what exactly to do to find and hopefully reclaim the missing space?
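
    One cheap thing to rule out is space hiding underneath an active mount point: du skips files shadowed by a mount, while df still counts them. A sketch using a bind mount (the mount point name is arbitrary):

        mkdir /mnt/rootonly
        mount --bind / /mnt/rootonly
        du -hsx /mnt/rootonly     # sees files under /var, /tmp, etc. even if something is mounted over them
        umount /mnt/rootonly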
