Search Results

Search found 72722 results on 2909 pages for 'file processing'.


  • Android - Loop Through strings.xml file

    - by Alexis Cartier
    I was wondering if there is any way to loop through the strings.xml file. Let's say that I have the following format:

        <!-- FIRST SECTION -->
        <string name="change_password">Change Password</string>
        <string name="change_server">Change URL</string>
        <string name="default_password">password</string>
        <string name="default_server">http://xxx:8080</string>
        <string name="default_username">testPhoneAccount</string>

        <!-- SECOND SECTION -->
        <string name="debug_settings_category">Debug Settings</string>
        <string name="reload_data_every_startup_pref">reload_data_every_startup</string>
        <string name="reload_data_on_first_startup_pref">reload_data_on_first_startup</string>

    Now let's say I have this:

        private HashMap<String, Integer> hashmapStringValues = new HashMap<String, Integer>();

    Is there a way to iterate over only the second section of my XML file? Maybe wrap the section with a tag like <section2> and then iterate through it?

        public void initHashMap() {
            for (int i = 0; i < ????; i++) {    // here I need to loop only over the second section of my XML file
                String nameOfTag = ?;           // here I get the name of the tag
                int value = R.string.nameOfTag; // here I get the associated value of the tag
                this.hashmapStringValues.put(nameOfTag, value);
            }
        }
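
    One hedged sketch of a workaround: Android does not expose the XML comments or a <section2> wrapper at runtime, but the generated R.string fields can be walked with reflection, so a naming convention (the "section2_" prefix below is an assumption) can mark the second section:

        // Minimal sketch, assuming second-section entries are renamed with a "section2_" prefix.
        public void initHashMap() {
            for (java.lang.reflect.Field field : R.string.class.getFields()) {
                String nameOfTag = field.getName();
                if (!nameOfTag.startsWith("section2_")) {   // assumed naming convention
                    continue;
                }
                try {
                    int value = field.getInt(null);          // the generated resource id
                    this.hashmapStringValues.put(nameOfTag, value);
                } catch (IllegalAccessException e) {
                    // R.string fields are public static final ints, so this should not happen
                }
            }
        }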

    Read the article

  • File based caching under PHP

    - by azatoth
    I've been using http://code.google.com/p/phpbrowscap/ for a project, and it usually works nicely. But a few times its cache, which consists of plain PHP files (see http://code.google.com/p/phpbrowscap/source/browse/trunk/browscap/Browscap.php#372 et al.), has been "zeroed", i.e. the whole cache file has become a large blob of NULs. Instead of trying to find out why the files become NUL-filled, I thought it might be better to change the caching strategy to something more resilient. So I wonder if you have any good ideas about what would make a good solution; I've been looking at http://www.jongales.com/blog/2009/02/18/simple-file-based-php-cache-class/ and http://www.phpclasses.org/package/313-PHP-Cache-arbitrary-data-in-files-.html, and I have also thought of just saving a serialized array to the file instead of the pure PHP it uses now, but I'm uncertain which approach I should target here. I'm grateful for any insight into this area, as I know it's complex from a performance point of view.
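
    A minimal sketch of the serialized-array idea, with the write made atomic (temporary file plus rename) so a crash cannot leave a half-written or zeroed cache; function names and paths are illustrative:

        <?php
        // Write serialized data to a temp file, then rename() it into place;
        // rename() is atomic on the same filesystem, so readers never see a partial file.
        function cache_put($path, $data) {
            $tmp = $path . '.' . uniqid('', true) . '.tmp';
            if (file_put_contents($tmp, serialize($data), LOCK_EX) === false) {
                return false;
            }
            return rename($tmp, $path);
        }

        function cache_get($path) {
            if (!is_file($path)) {
                return null;
            }
            $raw = file_get_contents($path);
            return $raw === false ? null : unserialize($raw);
        }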

    Read the article

  • Getting zeros between data while reading a binary file in C

    - by indiajoe
    I have binary data which I am reading into an array of long integers using a C program. A hexdump of the binary data shows that after the first few data points, the data starts again at a location 0x20000 addresses away. The hexdump output is shown below:

        0000000 0000 0000 0000 0000 0000 0000 0000 0000
        *
        0020000 0000 0000 0053 0000 0064 0000 006b 0000
        0020010 0066 0000 0068 0000 0066 0000 005d 0000
        0020020 0087 0000 0059 0000 0062 0000 0066 0000
        ... and so on ...

    But when I read it into an array 'data' of long integers with the typical fread call

        fread(data, sizeof(*data), filelength/sizeof(*data), fd);

    my data array fills up with all zeros until it reaches the 0x20000 location. After that it reads the data correctly. Why is it reading regions where my file is not there? Or how can I make it read only my file, and not anything in between which is not in the file? I know it looks like a trivial problem, but I cannot figure it out even after googling for a night. Can anyone suggest where I am going wrong? Other info: I am working on a GNU/Linux machine (the slax-atma distro, to be specific) and my C compiler is gcc.
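
    Worth knowing: the lone * line in hexdump output means "the previous line repeats", so the file really does contain zero bytes up to offset 0x20000, and fread() is reporting them faithfully. If the aim is simply to drop those runs, a hedged sketch (file name assumed) is to skip all-zero blocks as they are read:

        /* Minimal sketch: read the file in fixed-size blocks of longs and skip
         * any block that is entirely zero instead of keeping it. */
        #include <stdio.h>
        #include <string.h>

        int main(void) {
            FILE *fd = fopen("data.bin", "rb");        /* file name is an assumption */
            if (!fd) { perror("fopen"); return 1; }

            long block[512];
            long zeros[512] = {0};
            size_t n;

            while ((n = fread(block, sizeof(long), 512, fd)) > 0) {
                if (memcmp(block, zeros, n * sizeof(long)) == 0)
                    continue;                          /* all-zero block: skip it */
                /* ... process the n non-zero longs in block ... */
            }
            fclose(fd);
            return 0;
        }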

    Read the article

  • Using bash to copy file to spec folders

    - by Franko
    I have a folder with a fair number of subfolders. Some of the subfolders contain a folder.jpg picture. I am trying to find those folders and copy the picture to all the other subfolders that share the same artist and album information, then continue on to the next album, and so on. The structure of all the folders is "artist - year - album - [encoding information]". I have made a really simple one-liner that finds the folders containing the file, but that is where I am stuck:

        ls -F | grep / | while read folders; do find "$folders" -name folder.jpg; done

    Does anyone have any good tips or ideas on how to solve this, or pointers on how to proceed?

    Edit: First of all, I'm really new to this (as you can probably tell), so please be patient. Let me break it down a bit more. I have a folder structure that looks like this:

        artist1 - year - album - [flac]
        artist1 - year - album - [mp3]
        artist1 - year - album - [AAC]
        artist2 - year - album - [flac]
        etc

    I would like to loop over each set of folders that share the same artist and album information and look for a folder.jpg file. When I find that file, I want to copy it to all of the other folders in the same set. For example, if I find a folder.jpg in artist1 - year - album - [flac], I would like it copied to artist1 - year - album - [mp3] and artist1 - year - album - [AAC], but not to artist2 - year - album - [flac]. The loop then continues until all the sets have been processed. I really hope that makes it a bit easier to understand what I'm trying to do :)
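
    A hedged sketch of one way to do it, assuming the names really follow the "artist - year - album - [encoding]" pattern and the script runs from the parent directory: derive the prefix of each folder that holds a folder.jpg, then copy the picture into every sibling that shares it:

        #!/bin/bash
        shopt -s nullglob
        for src in */folder.jpg; do
            dir=${src%/folder.jpg}                  # e.g. "artist1 - year - album - [flac]"
            prefix=${dir% - \[*\]}                  # strip the trailing " - [encoding]"
            for dest in "$prefix - ["*"]"; do
                [ -d "$dest" ] || continue          # only copy into directories
                [ "$dest" = "$dir" ] && continue    # skip the folder the cover came from
                cp -n "$src" "$dest/folder.jpg"     # -n: keep any cover that already exists
            done
        done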

    Read the article

  • Still don't understand file upload-folder permissions

    - by Camran
    I have checked out articles and tutorials, but I still don't know what to do about the security of my picture upload folder. It holds pictures for classifieds, which are uploaded into the folder. This is what I want:

        - Anybody may upload images to the folder.
        - The images will be moved to another folder later by another PHP script (automatically).
        - Only I may remove them manually, plus another PHP file on the server which automatically empties the folder after x days.

    What should I do here? The images are uploaded via a PHP upload script. This script checks that the extension of the file is actually a valid image type. When I try chmod 755 images, the images won't be uploaded, but with chmod 777 images it works. 777 is a security risk, right? Please give me detailed information... The question is what to do to solve this problem, not information about what the permission bits mean, etc. Thanks. If you need more info, let me know...
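
    A hedged sketch of the usual fix: 777 is only needed because the web-server user does not own the folder; giving that user ownership lets 755 work (the user name www-data is an assumption and varies by distribution):

        # run on the server, in the parent directory of "images"
        chown www-data:www-data images   # make the PHP/web-server user the owner
        chmod 755 images                 # owner: rwx, everyone else: read/traverse only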

    Read the article

  • JavaFX: File upload to REST service / servlet fails because of missing boundary

    - by spa
    I'm trying to upload a file from JavaFX using HttpRequest. For this purpose I have written the following function:

        function uploadFile(inputFile : File) : Void {
            // check file
            if (inputFile == null or not(inputFile.exists()) or inputFile.isDirectory()) {
                return;
            }
            def httpRequest : HttpRequest = HttpRequest {
                location: urlConverter.encodeURL("{serverUrl}");
                source: new FileInputStream(inputFile)
                method: HttpRequest.POST
                headers: [
                    HttpHeader {
                        name: HttpHeader.CONTENT_TYPE
                        value: "multipart/form-data"
                    }
                ]
            }
            httpRequest.start();
        }

    On the server side, I am trying to handle the incoming data with the Apache Commons FileUpload API inside a Jersey REST service. The code is a simple copy of the FileUpload tutorial on the Apache homepage:

        @Path("Upload")
        public class UploadService {

            public static final String RC_OK = "OK";
            public static final String RC_ERROR = "ERROR";

            @POST
            @Produces("text/plain")
            public String handleFileUpload(@Context HttpServletRequest request) {
                if (!ServletFileUpload.isMultipartContent(request)) {
                    return RC_ERROR;
                }
                FileItemFactory factory = new DiskFileItemFactory();
                ServletFileUpload upload = new ServletFileUpload(factory);
                List<FileItem> items = null;
                try {
                    items = upload.parseRequest(request);
                } catch (FileUploadException e) {
                    e.printStackTrace();
                    return RC_ERROR;
                }
                ...
            }
        }

    However, I get an exception at items = upload.parseRequest(request);:

        org.apache.commons.fileupload.FileUploadException: the request was rejected because no multipart boundary was found

    I guess I have to add manual boundary info to the InputStream. Is there any easy solution for this? Or are there other solutions entirely?
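
    A hedged sketch of one way out: a multipart/form-data request is only valid when the Content-Type header names a boundary and the body is framed with it, so the body can be assembled by hand and sent as the request source (the helper class and the field name "file" are illustrative, not part of the JavaFX API):

        import java.io.ByteArrayOutputStream;
        import java.io.File;
        import java.io.IOException;
        import java.nio.file.Files;

        // Builds a minimal multipart/form-data body containing a single file part.
        class MultipartBodySketch {
            static byte[] build(File inputFile, String boundary) throws IOException {
                ByteArrayOutputStream body = new ByteArrayOutputStream();
                String head = "--" + boundary + "\r\n"
                        + "Content-Disposition: form-data; name=\"file\"; filename=\""
                        + inputFile.getName() + "\"\r\n"
                        + "Content-Type: application/octet-stream\r\n\r\n";
                body.write(head.getBytes("UTF-8"));
                body.write(Files.readAllBytes(inputFile.toPath()));   // the file contents
                body.write(("\r\n--" + boundary + "--\r\n").getBytes("UTF-8"));
                return body.toByteArray();
            }
        }

    The Content-Type header then becomes "multipart/form-data; boundary=" + boundary, and the returned bytes (wrapped in a ByteArrayInputStream, for example) are used as the request source instead of the raw FileInputStream.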

    Read the article

  • Using delayed_job to process file uploads across multiple servers

    - by Steve Klabnik
    Does anyone have any good resources on how to do this? Basically, I'm working on a project (in Rails) where people can upload files. They might be big. I'd like to process them using delayed_job before sending them to S3. I'd also like to do this processing on a separate job queue server, rather than on the webserver itself. I'd rather not have to upload the files to the webserver, then transfer them to the job queue server, and then upload them to S3 if I don't have to. Thanks.

    Read the article

  • Loading .dll/.exe from file into temporary AppDomain throws Exception

    Hi gang, I am trying to make a Visual Studio add-in that removes unused references from the projects in the current solution (I know this can be done, ReSharper does it, but my client doesn't want to pay for 300 licences). Anyhow, I use DTE to loop through the projects, compile their assemblies, then reflect over those assemblies to get their referenced assemblies and cross-examine the .csproj file. Problem: since the .dll/.exe I loaded with reflection doesn't unload until the app domain unloads, it is now locked and the projects can't be built again because VS tries to re-create the files (all standard stuff). I have tried creating temporary files and reflecting over those instead... no luck, the original files are still locked (I totally don't understand that, BTW). So now I am going down the path of creating a temporary AppDomain to load the files into and then destroy. I am having problems loading the files, though. The way I understand AppDomain.Load, I should create a byte array of the assembly and pass it in, which I do:

        FileStream fs = new FileStream(assemblyFile, FileMode.Open);
        byte[] assemblyFileBuffer = new byte[(int)fs.Length];
        fs.Read(assemblyFileBuffer, 0, assemblyFileBuffer.Length);
        fs.Close();

        AppDomainSetup domainSetup = new AppDomainSetup();
        domainSetup.ApplicationBase = assemblyFileInfo.Directory.FullName;

        AppDomain tempAppDomain = AppDomain.CreateDomain("TempAppDomain", null, domainSetup);
        Assembly projectAssembly = tempAppDomain.Load(assemblyFileBuffer);

    The last line throws an exception:

        "Could not load file or assembly 'WindowsFormsApplication1, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified." : "WindowsFormsApplication3, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"

    Any help or thoughts would be greatly appreciated. My head is lopsided from beating it against the wall... Thanks, Dan
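
    A hedged sketch of one way around both problems at once: AppDomain.Load(byte[]) called from outside the target domain also tries to load the assembly into the calling domain, and a normal load wants every dependency resolvable. Reading the bytes yourself (so no lock is kept) and using the reflection-only context inside a scratch domain, reached through a MarshalByRefObject proxy, is enough when only the referenced-assembly list is needed (class and method names below are illustrative):

        using System;
        using System.IO;
        using System.Reflection;

        public class AssemblyInspector : MarshalByRefObject
        {
            public string[] GetReferencedAssemblies(string assemblyFile)
            {
                byte[] bytes = File.ReadAllBytes(assemblyFile);       // no file lock is kept
                Assembly asm = Assembly.ReflectionOnlyLoad(bytes);    // metadata only
                return Array.ConvertAll(asm.GetReferencedAssemblies(), r => r.FullName);
            }
        }

        public static class Program
        {
            public static void Main(string[] args)
            {
                AppDomain temp = AppDomain.CreateDomain("TempAppDomain");
                try
                {
                    var inspector = (AssemblyInspector)temp.CreateInstanceAndUnwrap(
                        typeof(AssemblyInspector).Assembly.FullName,
                        typeof(AssemblyInspector).FullName);
                    foreach (string name in inspector.GetReferencedAssemblies(args[0]))
                        Console.WriteLine(name);
                }
                finally
                {
                    AppDomain.Unload(temp);                           // releases everything loaded there
                }
            }
        }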

    Read the article

  • SmartGWT Server File Browser

    - by jluzwick
    Hi All: I have been looking for a SmartGWT example that would show me how to build a File Browser widget that takes the files from the local server's root directory. The user would be shown the files through the browser which they could then select to perform some processing operations. So far I have thought of using SmartGWT's Tree-Data Binding-Load from Local Data Widget and then grabbing a list of the directories using: new File("\").listFiles(); My Question is: Is there a better way to do this? Has someone already thought of this and has an example of their code that I can see? PS: I'm fairly new to GWT and Web Services but fairly competent with Java. If you believe there is a better way to do this (while still doing this through the web and not using Applets, please tell me). Thanks

    Read the article

  • Creating a tar file with checksums included

    - by wazoox
    Here's my problem: I need to archive a large amount of data (up to 60 TB) consisting of big files (usually 30 to 40 GB each) to tar files. I would like to make checksums (md5, sha1, whatever) of these files before archiving; however, not reading every file twice (once for checksumming, twice for tar'ing) is more or less a necessity to achieve very high archiving performance (LTO-4 wants 120 MB/s sustained, and the backup window is limited). So I need some way to read a file, feeding a checksumming tool on one side and building a tar to tape on the other side, something along the lines of:

        tar cf - files | tee tarfile.tar | md5sum -

    except that I don't want the checksum of the whole archive (which is what this sample shell code produces) but a checksum for each individual file in the archive. I've studied the GNU tar, Pax, and Star options. I've looked at the source of Archive::Tar. I see no obvious way to achieve this. It looks like I'll have to hand-build something in C or similar to achieve what I need. Perl/Python/etc. simply won't cut it performance-wise, and the various tar programs miss the necessary "plugin architecture". Does anyone know of an existing solution to this before I start code-churning?
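
    For what it's worth, a minimal sketch of the single-pass structure (leaving aside the asker's doubt that a scripted solution can sustain the required throughput): wrap each file object so every read also feeds an MD5 digest, then hand that wrapper to tarfile, so the data is only read once; file and path handling are illustrative:

        #!/usr/bin/env python3
        # Stream a tar to stdout while recording a per-file MD5 in checksums.md5.
        import hashlib
        import sys
        import tarfile

        class HashingReader:
            """File-like wrapper that updates an MD5 digest as the data is read."""
            def __init__(self, fileobj):
                self.fileobj = fileobj
                self.md5 = hashlib.md5()
            def read(self, size=-1):
                data = self.fileobj.read(size)
                self.md5.update(data)
                return data

        tar = tarfile.open(fileobj=sys.stdout.buffer, mode="w|")   # streaming, uncompressed
        with open("checksums.md5", "w") as sums:
            for path in sys.argv[1:]:
                info = tar.gettarinfo(path)
                with open(path, "rb") as f:
                    reader = HashingReader(f)
                    tar.addfile(info, reader)          # single pass over the file
                sums.write("%s  %s\n" % (reader.md5.hexdigest(), path))
        tar.close()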

    Read the article

  • Saving NSString to file

    - by Michael Amici
    I am having trouble saving simple website data to a file. I believe the syntax is correct; could someone help me? When I run it, it shows that it gets the data, but when I quit and open up the file that it is supposed to save to, nothing is in it.

        - (BOOL)textFieldShouldReturn:(UITextField *)nextField {
            [timer invalidate];
            startButton.hidden = NO;
            startButton.enabled = YES;
            stopButton.enabled = NO;
            stopButton.hidden = YES;
            stopLabel.hidden = YES;
            label.hidden = NO;
            label.text = @"Press to Activate";
            [nextField resignFirstResponder];

            NSString *urlString = textField.text;
            NSData *dataNew = [NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]];
            NSUInteger len = [dataNew length];
            NSString *stringCompare = [NSString stringWithFormat:@"%i", len];
            NSLog(@"%@", stringCompare);

            NSString *filePath = [[NSBundle mainBundle] pathForResource:@"websiteone" ofType:@"txt"];
            if (filePath) {
                [stringCompare writeToFile:filePath atomically:YES encoding:NSUTF8StringEncoding error:NULL];
                NSString *myText = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:NULL];
                NSLog(@"Saving... %@", myText);
            } else {
                NSLog(@"cant find the file");
            }
            return YES;
        }
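
    A hedged sketch of the usual explanation: the main bundle is read-only on a device, so a write aimed at a bundled resource does not stick; writing to the app's Documents directory instead is the normal pattern (the file name below is illustrative):

        // Build a path inside the app's Documents directory and write there.
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES);
        NSString *filePath = [[paths objectAtIndex:0]
                              stringByAppendingPathComponent:@"websiteone.txt"];
        [stringCompare writeToFile:filePath
                        atomically:YES
                          encoding:NSUTF8StringEncoding
                             error:NULL];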

    Read the article

  • looping problem while appending data to existing text file

    - by Manu
    try {
            stmt = conn.createStatement();
            stmt1 = conn.createStatement();
            stmt2 = conn.createStatement();
            rs = stmt.executeQuery("select cust from trip1");
            rs1 = stmt1.executeQuery("select cust from trip2");
            rs2 = stmt2.executeQuery("select cust from trip3");
            File f = new File(strFileGenLoc);
            OutputStream os = (OutputStream)new FileOutputStream(f,true);
            String encoding = "UTF8";
            OutputStreamWriter osw = new OutputStreamWriter(os, encoding);
            BufferedWriter bw = new BufferedWriter(osw);
        }
        while ( rs.next() ) {
            while (rs1.next()) {
                while (rs2.next()) {
                    bw.write(rs.getString(1)==null? "":rs.getString(1));
                    bw.write("\t");
                    bw.write(rs1.getString(1)==null? "":rs1.getString(1));
                    bw.write("\t");
                    bw.write(rs2.getString(1)==null? "":rs2.getString(1));
                    bw.write("\t");
                    bw.newLine();
                }
            }
        }

    The above code is working, but here is my problem:

        1. The "rs" result set contains one record in its table.
        2. The "rs1" result set contains 5 records in its table.
        3. The "rs2" result set contains 5 records in its table.

    The "rs" data keeps repeating. When writing to the same text file, the output I am getting looks like:

        1 2 3
        1 12 21
        1 23 25
        1 10 5
        1 8 54

    but I need output like below:

        1 2 3
        12 21
        23 25
        10 5
        8 54

    What do I need to change in my code? Please advise.
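
    A hedged sketch of the usual fix: advance the three result sets in step instead of nesting them, emitting a column only while its result set still has rows; this is meant as a drop-in replacement for the nested loops, inside the existing try/catch:

        // Walk rs, rs1 and rs2 in parallel; once a result set is exhausted its column disappears.
        boolean has1 = rs.next();
        boolean has2 = rs1.next();
        boolean has3 = rs2.next();
        while (has1 || has2 || has3) {
            StringBuilder line = new StringBuilder();
            if (has1) {
                String v = rs.getString(1);
                line.append(v == null ? "" : v).append("\t");
            }
            if (has2) {
                String v = rs1.getString(1);
                line.append(v == null ? "" : v).append("\t");
            }
            if (has3) {
                String v = rs2.getString(1);
                line.append(v == null ? "" : v);
            }
            bw.write(line.toString());
            bw.newLine();
            has1 = has1 && rs.next();
            has2 = has2 && rs1.next();
            has3 = has3 && rs2.next();
        }
        bw.flush();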

    Read the article

  • What can i use to journal writes to file system

    - by Dmitry
    Hello all, I need to track all writes to files in order to keep a synchronized version of the files in a different place (a server or just another directory, it doesn't matter). Assume that:

        - all files are located in the same directory;
        - it is fine to create some extra system files (e.g. SomeFileName.Ext~temp-data);
        - no one has concurrent access to the synced directory; nothing will spoil our meta-files or change the real files before we apply the postponed writes (like commits);
        - we do not care about recovering "local" changes after a crash; the system can simply be rolled back to the "server" state by copying from it;
        - it is important that this is transparent to use (the programmer should just call ordinary fopen(), read(), write()).

    It must be guaranteed that the copy of the files the "server" has is consistent, i.e. the whole set of files existed together at some moment in time. It may be quite outdated, but it must be a fair snapshot of all the files at some point. As I understand it, I should wrap the writing logic to collect data in order to send the changes to the "server", for example by writing to a temporary File~tmp, and I would also have to wrap reads so the program can still read the current contents of a file. It would be great if you could suggest an existing library (Java or C++, it doesn't matter) or solution (customizing a VCS?), or give hints on how I should write it myself.

    Edit: After some reading I have more precise requirements: I need a COW (copy-on-write) wrapper for fopen(), fwrite(), ..., or an interceptor (hook) for WriteFile() and the other filesystem API calls. A log-structured file system in userspace would be an alternative too.
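
    On Linux, a hedged sketch of the interception half of this (not the journaling logic itself): an LD_PRELOAD shim that overrides fopen(), notes opens for writing, and delegates to the real libc function:

        /* journal.c -- minimal sketch of intercepting fopen() with LD_PRELOAD.
         * Build: gcc -shared -fPIC -o journal.so journal.c -ldl
         * Use:   LD_PRELOAD=./journal.so ./your_program
         */
        #define _GNU_SOURCE
        #include <dlfcn.h>
        #include <stdio.h>
        #include <string.h>

        FILE *fopen(const char *path, const char *mode) {
            static FILE *(*real_fopen)(const char *, const char *) = NULL;
            if (!real_fopen)
                real_fopen = (FILE *(*)(const char *, const char *))dlsym(RTLD_NEXT, "fopen");
            if (strpbrk(mode, "wa+"))   /* opened for writing: record it for the journal */
                fprintf(stderr, "journal: open-for-write %s (mode %s)\n", path, mode);
            return real_fopen(path, mode);
        }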

    Read the article

  • using .htaccess to redirect from friendly url to actual file

    - by Kohalza
    I have the following RewriteRule in my .htaccess to redirect from a friendly url to my main application file: RewriteRule ^\/(.*).html$ home/www/page.php?p=$1 [L] This should send any url that points to a html page to page.php with the url as a parameter that will be parsed by the app. This works for urls that look like http://www.example.com/hello.html The problem is that I get a 404 error when the url contains a directory path, for example: http://www.example.com/category/hello.html The error reads: "File does not exist: /home/www/category" Seems it is first looking for the 'category' path instead of processing the .htaccess Any ideas how to solve this?
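
    A hedged sketch of the usual explanation: in per-directory (.htaccess) context the leading slash is stripped from the URI before the pattern is tested, so ^\/(.*) can fail to match and Apache falls back to mapping the literal path, hence "File does not exist: /home/www/category". A slash-free pattern with a root-relative target (adjust /page.php to wherever the script actually lives) is the usual form:

        RewriteEngine On
        # don't rewrite requests for files that really exist on disk
        RewriteCond %{REQUEST_FILENAME} !-f
        # no leading slash in per-directory context; the full path becomes the parameter
        RewriteRule ^(.*)\.html$ /page.php?p=$1 [L,QSA]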

    Read the article

  • Java: Embedding Soundbank file in JAR

    - by Pyroclastic
    If I have a soundbank stored in a JAR, how would I load that soundbank into my application using resource loading? I'm trying to consolidate as much of a MIDI program into the jar file as I can, and the last thing I have to add is the soundbank file I'm using, as users won't have the soundbanks installed. I'm trying to put it into my jar file and then load it with getResource() in the Class class, but I'm getting an InvalidMidiDataException on a soundbank that I know is valid. Here's the code; it's in the constructor for my synthesizer object:

        try {
            synth = MidiSystem.getSynthesizer();
            channels = synth.getChannels();
            instrument = MidiSystem.getSoundbank(this.getClass().getResource("img/soundbank-mid.gm")).getInstruments();
            currentInstrument = instrument[0];
            synth.loadInstrument(currentInstrument);
            synth.open();
        } catch (InvalidMidiDataException ex) {
            System.out.println("FAIL");
            instrument = synth.getAvailableInstruments();
            currentInstrument = instrument[0];
            synth.loadInstrument(currentInstrument);
            try {
                synth.open();
            } catch (MidiUnavailableException ex1) {
                Logger.getLogger(MIDISynth.class.getName()).log(Level.SEVERE, null, ex1);
            }
        } catch (IOException ex) {
            Logger.getLogger(MIDISynth.class.getName()).log(Level.SEVERE, null, ex);
        } catch (MidiUnavailableException ex) {
            Logger.getLogger(MIDISynth.class.getName()).log(Level.SEVERE, null, ex);
        }
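
    A hedged sketch of one thing to try: resolve the soundbank as a stream (a path starting with "/" is taken from the root of the jar rather than the class's package) and wrap it in a BufferedInputStream, since MidiSystem.getSoundbank(InputStream) may need mark/reset support; the method name is illustrative and the imports are java.io.* plus javax.sound.midi.*:

        // Call this instead of MidiSystem.getSoundbank(getResource(...)) in the constructor above.
        private Soundbank loadBundledSoundbank() throws InvalidMidiDataException, IOException {
            InputStream raw = getClass().getResourceAsStream("/img/soundbank-mid.gm");
            if (raw == null) {
                throw new IOException("soundbank not found on the classpath");
            }
            // getSoundbank(InputStream) may require mark/reset support, hence the buffer
            return MidiSystem.getSoundbank(new BufferedInputStream(raw));
        }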

    Read the article

  • Parse a CSV file using python (to make a decision tree later)

    - by Margaret
    First off, full disclosure: This is going towards a uni assignment, so I don't want to receive code. :). I'm more looking for approaches; I'm very new to python, having read a book but not yet written any code. The entire task is to import the contents of a CSV file, create a decision tree from the contents of the CSV file (using the ID3 algorithm), and then parse a second CSV file to run against the tree. There's a big (understandable) preference to have it capable of dealing with different CSV files (I asked if we were allowed to hard-code the column names, mostly to eliminate it as a possibility, and the answer was no). The CSV files are in a fairly standard format; the header row is marked with a #, then the column names are displayed, and every row after that is a simple series of values. Example:

        # Column1, Column2, Column3, Column4
        Value01, Value02, Value03, Value04
        Value11, Value12, Value13, Value14

    At the moment, I'm trying to work out the first part: parsing the CSV. To make the decisions for the decision tree, a dictionary structure seems like it's going to be the most logical, so I was thinking of doing something along these lines:

        Read in each line, character by character
            If the character is not a comma or a space
                Append character to temporary string
            If the character is a comma
                Append the temporary string to a list
                Empty the string
        Once a line has been read
            Create a dictionary using the header row as the key (somehow!)
            Append that dictionary to a list

    However, if I do things that way, I'm not sure how to make a mapping between the keys and the values. I'm also wondering whether there is some way to perform an action on every dictionary in a list, since I'll need to be doing things to the effect of "Everyone return their values for columns Column1 and Column4, so I can count up who has what!" - I assume that there is some mechanism, but I don't think I know how to do it. Is a dictionary the best way to do it? Would I be better off doing things using some other data structure? If so, what?

    Read the article

  • Best way to do something when a runloop event is done processing?

    - by quixoto
    I have some processing in my Cocoa app that sometimes ends up calling through a hierarchy of data to do a bunch of work as the result of an event. Each small piece creates and destroys some resources. I don't want those resources around most of the time, but I would like to find a smart way of creating them before all the work and killing them at the end. Short of creating the resources up front and then passing them entirely down through the call hierarchy when work is done, is there a way to know locally in some code when an event loop run has ended? Then I could create them if they're not there, and keep them until the run loop ends, reusing them for any subsequent calls before that time. Thanks.
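
    A hedged sketch of one mechanism that fits this (assuming a blocks-capable SDK): a run-loop observer registered for kCFRunLoopBeforeWaiting fires once the main run loop has finished processing the current event and is about to go idle, which is a natural point to tear down lazily created resources:

        // Observe the main run loop; the block runs each time the loop is about to sleep.
        CFRunLoopObserverRef observer = CFRunLoopObserverCreateWithHandler(
            kCFAllocatorDefault,
            kCFRunLoopBeforeWaiting,   // current pass through the loop is finishing
            true,                      // repeats
            0,                         // order
            ^(CFRunLoopObserverRef obs, CFRunLoopActivity activity) {
                // release the resources created during this event here
            });
        CFRunLoopAddObserver(CFRunLoopGetMain(), observer, kCFRunLoopCommonModes);
        // CFRelease(observer) later, once it is no longer needed.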

    Read the article

  • Reading strings and integers from .txt file and printing output as strings only

    - by screename71
    Hello, I'm new to C++, and I'm trying to write a short C++ program that reads lines of text from a file, with each line containing one integer key and one alphanumeric string value (no embedded whitespace). The number of lines is not known in advance (i.e., keep reading lines until end of file is reached). The program needs to use the std::map data structure to store the integers and strings read from input (and to associate the integers with the strings). The program then needs to output the string values (but not the integer values) to standard output, one per line, sorted by integer key value (smallest to largest). So, for example, suppose I have a text file called "data.txt" which contains the following five lines:

        10 dog
        -50 horse
        0 cat
        -12 zebra
        14 walrus

    The output should then be:

        horse
        zebra
        cat
        dog
        walrus

    I've pasted below the progress I've made so far on my C++ program:

        #include <fstream>
        #include <iostream>
        #include <map>

        using namespace std;
        using std::map;

        int main () {
            string name;
            signed int value;
            ifstream myfile ("data.txt");
            while (! myfile.eof() ) {
                getline(myfile,name,'\n');
                myfile >> value >> name;
                cout << name << endl;
            }
            return 0;
            myfile.close();
        }

    Unfortunately, this produces the following incorrect output:

        horse
        cat
        zebra
        walrus

    If anyone has any tips, hints, suggestions, etc. on the changes and revisions I need to make to get it to work as needed, can you please let me know? Thanks!
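
    A hedged sketch of the intended shape: read key/value pairs straight from the stream (which also avoids the eof() pitfall), store them in a std::map<int, std::string>, which keeps its keys ordered, and then print just the mapped names:

        #include <fstream>
        #include <iostream>
        #include <map>
        #include <string>

        int main() {
            std::ifstream myfile("data.txt");
            std::map<int, std::string> entries;

            int key;
            std::string name;
            while (myfile >> key >> name) {   // stops cleanly at end of file
                entries[key] = name;
            }

            // std::map iterates in ascending key order, so this is already sorted.
            for (std::map<int, std::string>::const_iterator it = entries.begin();
                 it != entries.end(); ++it) {
                std::cout << it->second << std::endl;
            }
            return 0;
        }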

    Read the article

  • Using a function found in a different file in a loop

    - by Anders
    This question is related to BuddyPress and is a follow-up to this question. I have a .csv file with 790 rows and 3 columns, where the first column is the group name, the second is the group description and the last (third) is the slug. As far as I've been told, I can use this code:

        <?php
        $groups = array();
        if (($handle = fopen("groupData.csv", "r")) !== FALSE) {
            while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
                $group = array(
                    'group_id'     => 'SOME ID',
                    'name'         => $data[0],
                    'description'  => $data[1],
                    'slug'         => groups_check_slug(sanitize_title(esc_attr($data[2]))),
                    'date_created' => gmdate("Y-m-d H:i:s"),
                    'status'       => 'public'
                );
                $groups[] = $group;
            }
            fclose($handle);
        }
        foreach ($groups as $group) {
            groups_create_group($group);
        }

    together with http://www.nomorepasting.com/getpaste.php?pasteid=35217, which is called bp-groups.php. The thing is that I can't make it work. I've created a new file with the code written above called groupgenerator.php, uploaded the .csv file to the same folder, and opened groupgenerator.php in my browser. But I get this error:

        Fatal error: Call to undefined function groups_check_slug() in

    What am I doing wrong?
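
    A hedged sketch of the usual cause: a standalone .php file opened directly in the browser never loads WordPress, so BuddyPress functions such as groups_check_slug() are undefined; bootstrapping WordPress first makes them available (the wp-load.php path is an assumption about your install):

        <?php
        // groupgenerator.php -- load WordPress (and with it BuddyPress) before anything else
        require_once dirname(__FILE__) . '/wp-load.php';   // adjust the path to your WordPress root

        if (!function_exists('groups_create_group')) {
            die("BuddyPress is not active on this site.\n");
        }

        // ...the CSV loop from the question can follow here unchanged...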

    Read the article

  • Improving the join of two wave file?

    - by kaki
    I have written code for joining two wave files. It works fine when I am joining larger segments, but as I need to join very small segments the clarity is not good. I have learned that a signal-processing technique such as a windowed join can be used to improve the joining of the files:

        y[n] = w[n] * s[n]

    i.e. multiply the value of the signal at sample number n by the value of a windowing function; for a Hamming window,

        w[n] = 0.54 - 0.46 * cos(2*Pi*n / L),   0 <= n <= L

    I do not understand how to get the value of the signal at sample n, or how to implement this. The code I am using for joining is:

        import wave

        m = ['C:/begpython/S0001_0002.wav', 'C:/begpython/S0001_0001.wav']
        i = 1
        a = m[i]
        infiles = [a, "C:/begpython/S0001_0002.wav", a]
        outfile = "C:/begpython/S0001_00367.wav"

        data = []
        data1 = []
        for infile in infiles:
            w = wave.open(infile, 'rb')
            data1 = [w.getnframes]
            data.append([w.getparams(), w.readframes(w.getnframes())])
            #data1 = [ord(character) for character in data1]
            #print data1
            #data1 = ''.join(chr(character) for character in data1)
            w.close()

        output = wave.open(outfile, 'wb')
        output.setparams(data[0][0])
        output.writeframes(data[0][1])
        output.writeframes(data[1][1])
        output.writeframes(data[2][1])
        output.close()
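
    A hedged sketch of the missing pieces, assuming 16-bit little-endian mono frames: unpack the raw bytes into integer samples (that is "the value of the signal at sample n"), scale each sample by the Hamming weight, and pack them back before writing; the window would typically be applied to the tail of one segment and the head of the next:

        import math
        import struct

        def hamming(L):
            """w[n] = 0.54 - 0.46*cos(2*pi*n/(L-1)) for n = 0..L-1."""
            if L < 2:
                return [1.0] * L
            return [0.54 - 0.46 * math.cos(2 * math.pi * n / (L - 1)) for n in range(L)]

        def window_frames(raw_bytes):
            """Apply a Hamming window to raw 16-bit little-endian mono frames."""
            count = len(raw_bytes) // 2
            samples = struct.unpack('<%dh' % count, raw_bytes)    # bytes -> int samples
            w = hamming(count)
            shaped = [int(s * w[n]) for n, s in enumerate(samples)]
            return struct.pack('<%dh' % count, *shaped)           # int samples -> bytes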

    Read the article

  • how to write silverlight threading function in another file or project

    - by Piyush
    I am using a three-tier architecture. I have two projects, SilverlightUI and UIController. SilverlightUI contains only UI pages and controls, while UIController contains all the proxies for the WCF services. Now I have created threads to update my controls dynamically and to do processing in parallel. As per the requirement, I want to define all the thread functionality in the UIController project. What should I do? Currently what I am doing is:

        private void Button_Click(object sender, RoutedEventArgs e)
        {
            StartThreads();
        }

        private void StartThreads()
        {
            Thread _thread1 = new Thread(DoThread1);
            _thread1.Start();
        }

        public static void DoThread1()
        {
            _data1.Dispatcher.BeginInvoke(delegate()
            {
                _data1.Text = _count1.ToString();
            });
            System.Threading.Thread.Sleep(1000);
        }

    I want to write the DoThread1() method in the UIController project and call it from Button_Click() here.
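
    A hedged sketch of one way to split it (class and namespace names are assumptions): keep the thread logic in the UIController project and let the UI hand in a callback, so the controller never touches controls and the UI decides how to marshal onto its dispatcher:

        using System;
        using System.Threading;

        namespace UIController
        {
            public class CounterWorker
            {
                // publishText is supplied by the UI layer and receives each new value.
                public void Start(Action<string> publishText)
                {
                    var thread = new Thread(() =>
                    {
                        int count = 0;
                        while (true)
                        {
                            publishText(count.ToString());
                            count++;
                            Thread.Sleep(1000);
                        }
                    });
                    thread.IsBackground = true;
                    thread.Start();
                }
            }
        }

        // In SilverlightUI, e.g. inside Button_Click:
        //   new CounterWorker().Start(text =>
        //       _data1.Dispatcher.BeginInvoke(() => _data1.Text = text));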

    Read the article

  • Why do I get "Bad File Descriptor" when I try to read a file with Perl?

    - by Magicked
    I'm trying to read a binary file 40 bytes at a time, then check whether all those bytes are 0x00, and if so ignore them. If not, it writes them back out to another file (basically just cutting out large blocks of null bytes). This may not be the most efficient way to do this, but I'm not worried about that. However, right now I'm getting a "Bad File Descriptor" error and I cannot figure out why.

        my $comp = "\x00" * 40;
        my $byte_count = 0;

        my $infile = "/home/magicked/image1";
        my $outfile = "/home/magicked/image1_short";

        open IN, "<$infile";
        open OUT, ">$outfile";
        binmode IN;
        binmode OUT;

        my ($buf, $data, $n);
        while (read (IN, $buf, 40)) { ### Problem is here ###
            $boo = 1;
            for ($i = 0; $i < 40; $i++) {
                if ($comp[$i] != $buf[$i]) {
                    $i = 40;
                    print OUT $buf;
                    $byte_count += 40;
                }
            }
        }
        die "Problems! $!\n" if $!;
        close OUT;
        close IN;

    I marked with a comment where it is breaking. Thanks for any help!
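
    A hedged sketch of the same loop written with Perl's string operators: "x" (not "*") repeats the NUL byte, eq/substr compare the buffer bytewise, and the open() calls report their own failures, which matters because an unconditional "die ... if $!" can surface a stale error such as "Bad file descriptor" even when nothing in the loop failed:

        use strict;
        use warnings;

        my $comp = "\x00" x 40;                              # "x", not "*", repeats a string

        my $infile  = "/home/magicked/image1";
        my $outfile = "/home/magicked/image1_short";

        open my $in,  '<', $infile  or die "Can't read $infile: $!";
        open my $out, '>', $outfile or die "Can't write $outfile: $!";
        binmode $in;
        binmode $out;

        my $byte_count = 0;
        while (read($in, my $buf, 40)) {
            next if $buf eq substr($comp, 0, length $buf);   # skip all-zero blocks
            print {$out} $buf;
            $byte_count += length $buf;
        }
        close $out;
        close $in;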

    Read the article

  • Looking for a Regex to get SccTeamFoundationServer value from .sln file

    - by Arthur
    I am looking for a regex for C# to get the SccTeamFoundationServer value from a .sln file. Maybe someone has come across such a need and found a solution. Could you help?

    File:

        Microsoft Visual Studio Solution File, Format Version 10.00
        # Visual Studio 2008
        Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "WebApplication", "WebApplication\WebApplication.csproj", "{AE0F6C02-1C8D-426D-AFA0-C07A52E6112F}"
        EndProject
        Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "ConsoleApplication", "ConsoleApplication\ConsoleApplication.csproj", "{2BD82C34-CF50-4559-A3CD-F85ACD657292}"
        EndProject
        Global
            GlobalSection(TeamFoundationVersionControl) = preSolution
                SccNumberOfProjects = 3
                SccEnterpriseProvider = {4CA58AB2-18FA-4F8D-95D4-32DDF27D184C}
                SccTeamFoundationServer = http://ServerName:8080/
                SccLocalPath0 = .
                SccProjectUniqueName1 = ConsoleApplication\\ConsoleApplication.csproj
                SccProjectName1 = ConsoleApplication
                SccLocalPath1 = ConsoleApplication
                SccProjectUniqueName2 = WebApplication\\WebApplication.csproj
                SccProjectName2 = WebApplication
                SccLocalPath2 = WebApplication
            EndGlobalSection
            GlobalSection(SolutionConfigurationPlatforms) = preSolution
                Debug|Any CPU = Debug|Any CPU
                Release|Any CPU = Release|Any CPU
            EndGlobalSection
            GlobalSection(ProjectConfigurationPlatforms) = postSolution
                {AE0F6C02-1C8D-426D-AFA0-C07A52E6112F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
                {AE0F6C02-1C8D-426D-AFA0-C07A52E6112F}.Debug|Any CPU.Build.0 = Debug|Any CPU
                {AE0F6C02-1C8D-426D-AFA0-C07A52E6112F}.Release|Any CPU.ActiveCfg = Release|Any CPU
                {AE0F6C02-1C8D-426D-AFA0-C07A52E6112F}.Release|Any CPU.Build.0 = Release|Any CPU
                {2BD82C34-CF50-4559-A3CD-F85ACD657292}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
                {2BD82C34-CF50-4559-A3CD-F85ACD657292}.Debug|Any CPU.Build.0 = Debug|Any CPU
                {2BD82C34-CF50-4559-A3CD-F85ACD657292}.Release|Any CPU.ActiveCfg = Release|Any CPU
                {2BD82C34-CF50-4559-A3CD-F85ACD657292}.Release|Any CPU.Build.0 = Release|Any CPU
            EndGlobalSection
            GlobalSection(SolutionProperties) = preSolution
                HideSolutionNode = FALSE
            EndGlobalSection
        EndGlobal
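
    A hedged sketch of one regex that matches the sample above: anchor on the setting name with RegexOptions.Multiline so ^ and $ apply per line, and capture whatever follows the equals sign:

        using System.Text.RegularExpressions;

        static class SlnParser
        {
            // Returns the SccTeamFoundationServer value, or null if the line is absent.
            public static string GetTfsServer(string slnText)
            {
                Match m = Regex.Match(
                    slnText,
                    @"^\s*SccTeamFoundationServer\s*=\s*(\S+)\s*$",
                    RegexOptions.Multiline);
                return m.Success ? m.Groups[1].Value : null;
            }
        }

        // usage: string url = SlnParser.GetTfsServer(File.ReadAllText(pathToSln));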

    Read the article

  • Import module stored in a cStringIO data structure vs. physical disk file

    - by Malcolm
    Is there a way to import a Python module stored in a cStringIO data structure vs. physical disk file? It looks like "imp.load_compiled(name, pathname[, file])" is what I need, but the description of this method (and similar methods) has the following disclaimer: Quote: "The file argument is the byte-compiled code file, open for reading in binary mode, from the beginning. It must currently be a real file object, not a user-defined class emulating a file." [1] I tried using a cStringIO object vs. a real file object, but the help documentation is correct - only a real file object can be used. Any ideas on why these modules would impose such a restriction or is this just an historical artifact? Are there any techniques I can use to avoid this physical file requirement? Thanks, Malcolm [1] http://docs.python.org/library/imp.html#imp.load_module
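
    A hedged sketch of one technique that avoids the real-file requirement entirely, assuming the buffer holds module source text (Python 2, to match cStringIO): build a fresh module object and exec the buffered code into its namespace instead of going through imp.load_compiled():

        import imp
        import sys
        from cStringIO import StringIO

        def load_module_from_buffer(name, buf):
            """Create a module named `name` from source held in a StringIO buffer."""
            module = imp.new_module(name)
            code = compile(buf.getvalue(), '<memory:%s>' % name, 'exec')
            exec code in module.__dict__
            sys.modules[name] = module
            return module

        buf = StringIO("GREETING = 'hello'\n")
        mod = load_module_from_buffer('in_memory_example', buf)
        print mod.GREETING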

    Read the article

  • Replacing a word in a text file with a value using python

    - by Jamde Jam
    I have been trying to replace a word in a text file with a value (say 1), but my outfile is blank. I am new to Python (it's only been a month since I started learning it). My file is relatively large, but I just want to replace a word with the value 1 for now. Here is a segment of what the file looks like:

        NAME SECOND_1
        ATOM   1   6   0      0      0      # ORB 1
        ATOM   2   2   0      12/24  0      # ORB 2
        ATOM   3   2   12/24  0      0      # ORB 2
        ATOM   4   2   0      0      4/24   # ORB 3
        ATOM   5   2   0      0      20/24  # ORB 3
        ATOM   6   2   0      0      8/24   # ORB 3
        ATOM   7   2   0      0      16/24  # ORB 3
        ATOM   8   6   0      0      12/24  # ORB 1
        ATOM   9   2   12/24  0      12/24  # ORB 2
        ATOM   10  2   0      12/24  12/24  # ORB 2
        #1 #2 #3

    I want to first replace the word ATOM with the value 1. Next I want to replace #ORB with a space. Here is what I am trying so far:

        input = open('SECOND_orbitsJ22.txt','r')
        output = open('SECOND_orbitsJ22_out.txt','w')

        for line in input:
            word = line.split(',')
            if (word[0] == 'ATOM'):
                word[0] = '1'
                output.write(','.join(word))

    Can anyone offer any suggestions or help? Thanks so much.
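
    A hedged sketch of the two likely culprits and a fix: the fields are separated by whitespace rather than commas, so split(',') never yields 'ATOM' as its first element, and the output file is never closed, so buffered writes may never reach disk; splitting on whitespace and using with blocks addresses both (file names copied from the question). The second replacement can then be a plain str.replace() on the joined line before writing:

        # Replace the leading "ATOM" field with "1"; the "with" blocks close (and flush) both files.
        with open('SECOND_orbitsJ22.txt') as infile, \
             open('SECOND_orbitsJ22_out.txt', 'w') as outfile:
            for line in infile:
                fields = line.split()              # whitespace-separated, not comma-separated
                if fields and fields[0] == 'ATOM':
                    fields[0] = '1'
                outfile.write(' '.join(fields) + '\n')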

    Read the article
