Search Results

Search found 4705 results on 189 pages for 'export to csv'.

Page 6 of 189

  • CSV file read fails (PHP)

    - by user1020069
    I am trying to read a CSV file (delimited by commas), but it isn't behaving as it should. I am not sure what I am doing wrong here, so I'll paste both the contents of the code and the CSV file:

        $row = 0;
        if ($handle = fopen("SampleQuizData.csv", "r") !== FALSE) { // WORKS UNTIL HERE, SO FILE IS BEING READ
            while (!feof(handle)) {
                $line = fgetcsv($handle, 1024, ",");
                echo $line[2]; // DOES NOT WORK
            }
        }

    And the CSV file is (the emails and names have been changed here to protect the identities of the users):

        parijat,something,[email protected]
        matthew,durp, [email protected]
        steve,vai,[email protected]
        rajni,kanth,[email protected]

    Read the article

  • How can I add encoding to a Python-generated CSV file?

    - by user1958218
    I am following this post (http://stackoverflow.com/a/9016545) and I want to know how to do the same in Python. I don't know how to insert the BOM data. This is my current code:

        response = HttpResponse(content_type='text/csv')
        response['Content-Type'] = 'application/octet-stream'
        response['Content-Disposition'] = 'attachment; filename="results.csv"'
        writer = UnicodeWriter(response, quoting=csv.QUOTE_ALL, encoding="utf-8")

    I want to convert to UTF-16. The BOM, from http://stackoverflow.com/a/4440143, is:

        echo "\xEF\xBB\xBF"; // UTF-8 BOM

    but I want it for Python and UTF-16. I tried opening the CSV in Notepad and inserting \xef\xbb\xbf at the beginning, and Excel displayed it correctly, but the BOM is also visible before the first column. How can I hide that, because users won't like it?
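
    A minimal standalone sketch (Python 3, not the poster's Django/UnicodeWriter code) of the two usual ways to put a BOM in front of CSV output; the file names and row data are made up:

        # Sketch: write a BOM so Excel detects the encoding instead of showing it in a cell.
        import codecs
        import csv
        import io

        rows = [["email", "name"], ["[email protected]", "Some User"]]  # hypothetical data

        # Option 1: UTF-8 with BOM. The 'utf-8-sig' codec emits \xef\xbb\xbf itself,
        # and Excel treats it as an encoding marker rather than cell content.
        with open("results_utf8.csv", "w", encoding="utf-8-sig", newline="") as f:
            csv.writer(f, quoting=csv.QUOTE_ALL).writerows(rows)

        # Option 2: UTF-16 LE with an explicit BOM, built in memory so the same
        # bytes could be written to an HTTP response body instead of a file.
        buf = io.StringIO()
        csv.writer(buf, quoting=csv.QUOTE_ALL).writerows(rows)
        payload = codecs.BOM_UTF16_LE + buf.getvalue().encode("utf-16-le")
        with open("results_utf16.csv", "wb") as f:
            f.write(payload)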

    Read the article

  • 50 million+ Rows of Data - CSV or MySQL

    - by eWizardII
    Hello, I have a CSV file which is about 1 GB and contains about 50 million rows of data. I am wondering whether it is better to keep it as a CSV file or store it as some form of database. I don't know a great deal about MySQL, so I can't argue for why I should use it (or another database framework) over just keeping the CSV file. I am basically doing a breadth-first search with this dataset, so once I get the initial "seed" set from the 50 million rows, I use it as the first values in my queue. Thanks,
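
    Whether a database pays off depends on how the rows are accessed: if the BFS needs repeated keyed lookups rather than a single sequential pass, an indexed store helps. A hedged sketch of loading the CSV into SQLite with Python's standard library (the two-column edge layout and the names are assumptions, not the poster's schema):

        # Sketch: load the big CSV into SQLite once, then query it for BFS seeds.
        # Column names ("node", "neighbor") are hypothetical - adjust to the real file.
        import csv
        import sqlite3

        conn = sqlite3.connect("edges.db")
        conn.execute("CREATE TABLE IF NOT EXISTS edges (node TEXT, neighbor TEXT)")

        with open("data.csv", newline="") as f:
            # executemany consumes the reader lazily instead of holding 50M rows in memory
            conn.executemany("INSERT INTO edges VALUES (?, ?)", csv.reader(f))

        conn.execute("CREATE INDEX IF NOT EXISTS idx_node ON edges (node)")
        conn.commit()

        # A BFS seed lookup is now an indexed query instead of a full file scan.
        seeds = [n for (n,) in conn.execute("SELECT neighbor FROM edges WHERE node = ?", ("start",))]

    If the file is only ever read once per run, streaming it with csv.reader may be enough; a database mostly pays off when the search needs repeated keyed lookups.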

    Read the article

  • Adding new "columns" to csv data file in Tcl

    - by George
    Hi All, I am dealing with "large" measurement data, approximately 30K key-value pairs. The measurements run for a number of iterations, and after each iteration a data file (non-CSV) with 30K key-value pairs is created. I want to somehow create a CSV file of the form:

        Key1,value of iteration1,value of iteration2,...
        Key2,value of iteration1,value of iteration2,...
        Key3,value of iteration1,value of iteration2,...
        ...

    Now, I was wondering about an efficient way of adding each iteration's measurement data as a column to the CSV file in Tcl. So far it seems that either way I will need to load the whole CSV file into some variable (array/list) and work on each element, adding the new measurement data. This seems somewhat inefficient. Is there another way, perhaps?
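
    The question is about Tcl, but the underlying idea is language-independent: accumulate the values per key in memory and write the whole table once, instead of re-reading the CSV after every iteration. A rough sketch of that idea in Python (the iteration-file parser and file names are hypothetical):

        # Sketch of the append-a-column idea: accumulate per key, write once.
        import csv
        from collections import defaultdict

        def read_iteration(path):
            """Hypothetical parser: yields (key, value) pairs from one iteration's data file."""
            with open(path) as f:
                for line in f:
                    key, value = line.split()
                    yield key, value

        columns = defaultdict(list)              # key -> [value_iter1, value_iter2, ...]
        for path in ["iter1.dat", "iter2.dat", "iter3.dat"]:
            for key, value in read_iteration(path):
                columns[key].append(value)

        # One write at the end, instead of re-reading the CSV after every iteration.
        with open("measurements.csv", "w", newline="") as out:
            writer = csv.writer(out)
            for key, values in columns.items():
                writer.writerow([key] + values)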

    Read the article

  • Clarification needed about Python CSV file format parsing

    - by HH
    The format is like:

        CHINA;2002-06-25 00:00:00.000;5,60
        CHINA;2002-06-26 00:00:00.000;5,32
        CHINA;2002-06-27 00:00:00.000;5,31

    I am trying to use Python's CSV tools to parse it, but I cannot understand this paragraph from the source I am reading:

        And while the module doesn't directly support parsing strings, it can easily be done:

            import csv
            for row in csv.reader(['one,two,three']):
                print row

    Could someone clarify the line ['one,two,three']? How would you use it with the format A;B;C?
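
    For what it's worth, csv.reader accepts any iterable of strings (a list, a file object, a generator), so the one-element list ['one,two,three'] simply stands in for a one-line file. A small sketch using the semicolon-delimited format above (Python 3 print syntax):

        # Sketch: csv.reader works on any iterable of lines; pass the ';' delimiter.
        import csv

        lines = [
            "CHINA;2002-06-25 00:00:00.000;5,60",
            "CHINA;2002-06-26 00:00:00.000;5,32",
        ]

        for country, timestamp, value in csv.reader(lines, delimiter=";"):
            # "5,60" uses a decimal comma, so convert it by hand if a float is needed
            print(country, timestamp, float(value.replace(",", ".")))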

    Read the article

  • export shared services from MOSS

    - by vittocia
    Hello, using the stsadm command I have been able to export a MOSS website and restore it on a different server, which works fine. I tried the same for the shared services; it gave no errors, but not all the import connections are there when I check around. Is there a better way to export and restore shared services, or a way to sync the import connections and user list?

    Read the article

  • Outlook Shared Mailbox automatic calendar export

    - by Arthur
    I am aware that the shared mailbox feature is exclusive to Microsoft Exchange and does not work with any non-Microsoft products. I am trying to create a workaround, so I am looking for a way to automatically export a calendar on a schedule or by any other means. Does anybody know any good Outlook plugins that would do something like that? It must export to CSV, iCal, or some other readable format.

    Read the article

  • How to export opened tabs in Chrome?

    - by Ieyasu Sawada
    Are there any extensions for Chrome that allow me to export all currently opened tabs as a text file, containing all the URLs of those tabs? I don't necessarily need it to be a text file if there is another way that you can think of. My goal is to share the URLs with someone via email. I'm currently using Session Manager to save my open tabs but it has no functionality to export them as described above.

    Read the article

  • How do I export a large table into 50 smaller csv files of 100,000 records each

    - by Eddie
    I am trying to export one field from a very large table - containing 5,000,000 records, for example - into a CSV list, but not all together: rather, 100,000 records into each .csv file created, without duplication. How can I do this, please? I tried:

        SELECT field_name FROM table_name
        WHERE certain_conditions_are_met
        INTO OUTFILE /tmp/name_of_export_file_for_first_100000_records.csv
        LINES TERMINATED BY '\n'
        LIMIT 0, 100000

    That gives the first 100,000 records, but nothing I do gets the other 4,900,000 records exported into 49 other files - and how do I specify the other 49 filenames? For example, I tried the following, but the SQL syntax is wrong:

        SELECT field_name FROM table_name
        WHERE certain_conditions_are_met
        INTO OUTFILE /home/user/Eddie/name_of_export_file_for_first_100000_records.csv
        LINES TERMINATED BY '\n'
        LIMIT 0, 100000
        INTO OUTFILE /home/user/Eddie/name_of_export_file_for_second_100000_records.csv
        LINES TERMINATED BY '\n'
        LIMIT 100001, 200000

    and that did not create the second file... What am I doing wrong, please, and is there a better way to do this? Should the LIMIT 0, 100000 be put before the first INTO OUTFILE statement, and then the entire command repeated from SELECT for the second 100,000 records, etc.? Thanks for any help. Eddie
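
    MySQL accepts only one INTO OUTFILE clause per SELECT, so one common workaround is a client-side loop that runs one query per chunk and writes each chunk itself. A hedged sketch of that loop in Python (the mysql-connector-python driver is an assumption, and table_name / field_name / the WHERE placeholder are copied from the question, not real names):

        # Sketch: export a large result set in 100,000-row chunks, one CSV per chunk.
        import csv
        import mysql.connector  # assumption: any DB-API driver would do

        CHUNK = 100000
        conn = mysql.connector.connect(host="localhost", user="user",
                                       password="secret", database="mydb")
        cursor = conn.cursor()

        offset = 0
        part = 1
        while True:
            # CHUNK and offset are integers we control, so formatting them in is safe.
            cursor.execute(
                "SELECT field_name FROM table_name "
                "WHERE certain_conditions_are_met "      # placeholder from the question
                "ORDER BY field_name "                   # a stable order keeps chunks disjoint
                f"LIMIT {CHUNK} OFFSET {offset}")
            rows = cursor.fetchall()
            if not rows:
                break
            with open(f"export_part_{part:02d}.csv", "w", newline="") as f:
                csv.writer(f).writerows(rows)
            offset += CHUNK
            part += 1

        conn.close()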

    Read the article

  • Java: CSV file read & write.

    - by battousai622
    I'm reading in 2 CSV files: store_inventory and new_acquisitions... I want to be able to compare the store_inventory CSV file with new_acquisitions. 1) If the item names match, just update the quantity in store_inventory. 2) If new_acquisitions has a new item that does not exist in store_inventory, then add it to store_inventory. Here's what I have so far, but it's not very good. I added comments where I need to add tasks 1) and 2). Any advice or code would be great, thanks.

        File new_acq = new File("/src/test/new_acquisitions.csv");
        Scanner acq_scan = null;
        try {
            acq_scan = new Scanner(new_acq);
        } catch (FileNotFoundException ex) {
            Logger.getLogger(mainpage.class.getName()).log(Level.SEVERE, null, ex);
        }
        String itemName;
        int quantity;
        Double cost;
        Double price;

        File store_inv = new File("/src/test/store_inventory.csv");
        Scanner invscan = null;
        try {
            invscan = new Scanner(store_inv);
        } catch (FileNotFoundException ex) {
            Logger.getLogger(mainpage.class.getName()).log(Level.SEVERE, null, ex);
        }
        String itemNameInv;
        int quantityInv;
        Double costInv;
        Double priceInv;

        while (acq_scan.hasNext()) {
            String line = acq_scan.nextLine();
            if (line.charAt(0) == '#') {
                continue;
            }
            String[] split = line.split(",");
            itemName = split[0];
            quantity = Integer.parseInt(split[1]);
            cost = Double.parseDouble(split[2]);
            price = Double.parseDouble(split[3]);
            while (invscan.hasNext()) {
                String line2 = invscan.nextLine();
                if (line2.charAt(0) == '#') {
                    continue;
                }
                String[] split2 = line2.split(",");
                itemNameInv = split2[0];
                quantityInv = Integer.parseInt(split2[1]);
                costInv = Double.parseDouble(split2[2]);
                priceInv = Double.parseDouble(split2[3]);
                if (itemName == itemNameInv) {
                    // update quantity
                }
            }
            // add new entry into csv file
        }

    Thanks again for any help. =]
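
    The question is about Java, but the comparison itself is easier to see as a sketch: load the inventory into a map keyed by item name, fold the acquisitions into it, then write the result. Here is that idea in Python (column order - name, quantity, cost, price - follows the parsing above; whether "update the quantity" means add or replace is a guess):

        # Sketch of the merge: inventory keyed by item name, acquisitions folded in.
        import csv

        def load_rows(path):
            with open(path, newline="") as f:
                for row in csv.reader(f):
                    if not row or row[0].startswith("#"):      # skip blank and comment lines
                        continue
                    yield row[0], [row[0], int(row[1]), float(row[2]), float(row[3])]

        inventory = dict(load_rows("store_inventory.csv"))

        for name, row in load_rows("new_acquisitions.csv"):
            if name in inventory:
                inventory[name][1] += row[1]       # task 1: item exists -> update quantity (here: add)
            else:
                inventory[name] = row              # task 2: new item -> add it

        with open("store_inventory_updated.csv", "w", newline="") as f:
            csv.writer(f).writerows(inventory.values())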

    Read the article

  • Import csv data (SDK iphone)

    - by Ni
    I am new to Cocoa. I have been working on this for a few days. With the following code, I can read all the data in the string and successfully get the data for the plot.

        NSMutableArray *contentArray = [NSMutableArray array];
        NSString *filePath = @"995,995,995,995,995,995,995,995,1000,997,995,994,992,993,992,989,988,987,990,993,989";
        NSArray *myText = [filePath componentsSeparatedByString:@","];
        NSInteger idx;
        for (idx = 0; idx < myText.count; idx++) {
            NSString *data = [myText objectAtIndex:idx];
            NSLog(@"%@", data);
            id x = [NSNumber numberWithFloat:0+idx*0.002777778];
            id y = [NSDecimalNumber decimalNumberWithString:data];
            [contentArray addObject:[NSMutableDictionary dictionaryWithObjectsAndKeys:x, @"x", y, @"y", nil]];
        }
        self.dataForPlot = contentArray;

    Then I try to load the data from a CSV file. The data in the Data.csv file has the same values and the same format as 995,995,995,995,995,995,995,995,1000,997,995,994,992,993,992,989,988,987,990,993,989. When I run the code, it is supposed to give the same graph output; however, it seems that the data is not loaded from the CSV file successfully. I cannot figure out what's wrong with my code.

        NSMutableArray *contentArray = [NSMutableArray array];
        NSString *filePath = [[NSBundle mainBundle] pathForResource:@"Data" ofType:@"csv"];
        NSString *Data = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil];
        if (Data) {
            NSArray *myText = [Data componentsSeparatedByString:@","];
            NSInteger idx;
            for (idx = 0; idx < myText.count; idx++) {
                NSString *data = [myText objectAtIndex:idx];
                NSLog(@"%@", data);
                id x = [NSNumber numberWithFloat:0+idx*0.002777778];
                id y = [NSDecimalNumber decimalNumberWithString:data];
                [contentArray addObject:[NSMutableDictionary dictionaryWithObjectsAndKeys:x, @"x", y, @"y", nil]];
            }
            self.dataForPlot = contentArray;
        }

    The only difference is:

        NSString *filePath = [[NSBundle mainBundle] pathForResource:@"Data" ofType:@"csv"];
        NSString *Data = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil];
        if (data) {
        }

    Did I do anything wrong here?? Thanks for your help!!!!

    Read the article

  • Joomla - Force File Download / CSV Export

    - by lautaro.dragan
    I'm in need of help... this is my first time asking a question on SO, so please be kind :) I'm trying to force-download a file from PHP, so when the user hits a certain button, he gets a file download. The file is a CSV (email, username) of all registered users. I decided to add this button to the admin users screen, as you can see in this screenshot. So I added the following code to the addToolbar function in administrator/components/com_users/views/users/view.html.php:

        JToolBarHelper::custom('users.export', 'export.png', 'export_f2.png', 'Exportar', false);

    This button is mapped to the following function in the com_users\controller\users.php controller:

        public function exportAllUsers() {
            ob_end_clean();
            $app = JFactory::getApplication();
            header("Content-type: text/csv");
            header("Content-Disposition: attachment; filename=ideary_users.csv");
            header("Pragma: no-cache");
            header("Expires: 0");
            echo "email,name\n";
            $model = $this->getModel("Users");
            $users = $model->getAllUsers();
            foreach ($users as $user) {
                echo $user->email . ", " . ucwords(trim($user->name)) . "\r\n";
            }
            $app->close();
        }

    Now, this is actually working perfectly fine. The issue is that after I download a file, if I hit any button in the admin that causes a POST, instead of performing the action it should, it just downloads the file over again! For example: I hit the "Export" button, and "users.csv" downloads. Then I hit the "search" button, and "users.csv" downloads again... what the hell? I'm guessing that when I hit the export button, a JS gets called and sets a form's action attribute to a URL... and expects a response or something, and then other buttons are prevented from re-setting the form's action attribute. I can't think of any real solution for this, but I'd rather avoid hacks if possible. So, what would be the standard, elegant solution that Joomla offers in this case?

    Read the article

  • Parsing CSV File to MySQL DB in PHP

    - by Austin
    I have a CSV file of some 350 lines with all sorts of vendors that fall into Clothes, Tools, Entertainment, etc. categories. Using the following code I have been able to print out my CSV file.

        <?php
        $fp = fopen('promo_catalog_expanded.csv', 'r');
        echo '<tr><td>';
        echo implode('</td><td>', fgetcsv($fp, 4096, ','));
        echo '</td></tr>';
        while (!feof($fp)) {
            list($cat, $var, $name, $var2, $web, $var3, $phone, $var4, $kw, $var5, $desc) = fgetcsv($fp, 4096);
            echo '<tr><td>';
            echo $cat . '</td><td>' . $name . '</td><td><a href="http://www.' . $web . '" target="_blank">' . $web . '</a></td><td>' . $phone . '</td><td>' . $kw . '</td><td>' . $desc . '</td>';
            echo '</td></tr>';
        }
        fclose($file_handle);
        show_source(__FILE__);
        ?>

    The first thing you will probably notice is the extraneous vars within the list(). This is because of how the Excel spreadsheet/CSV file is laid out:

        Category,,Company Name,,Website,,Phone,,Keywords,,Description
        ,,,,,,,,,,
        Clothes,,4imprint,,4imprint.com,,877-466-7746,,"polos, jackets, coats, workwear, sweatshirts, hoodies, long sleeve, pullovers, t-shirts, tees, tshirts,",,An embroidery and apparel company based in Wisconsin.
        ,,Apollo Embroidery,,apolloemb.com,,1-800-982-2146,,"hats, caps, headwear, bags, totes, backpacks, blankets, embroidery",,An embroidery sales company based in California.

    One thing to note is that the last line starts with two commas, as it is also listed within the "Clothes" category. My concern is that I am going about the CSV output wrong. Should I be using a foreach loop instead of this list() approach? Should I first get rid of any unnecessary blank columns? Please point out any flaws you find and improvements I can use so I can be ready to import this data into a MySQL DB.

    Read the article

  • Filling a Java bean tree structure from a CSV flat file

    - by Clem
    Hi, I'm currently trying to construct a list of bean classes in Java from a flat description file formatted as CSV. Concretely, here is the structure of the CSV file:

        MES_ID;GRP_PARENT_ID;GRP_ID;ATTR_ID
        M1    ;             ;G1    ;A1
        M1    ;             ;G1    ;A2
        M1    ;G1           ;G2    ;A3
        M1    ;G1           ;G2    ;A4
        M1    ;G2           ;G3    ;A5
        M1    ;             ;G4    ;A6
        M1    ;             ;G4    ;A7
        M1    ;             ;G4    ;A8
        M2    ;             ;G1    ;A1
        M2    ;             ;G1    ;A2
        M2    ;             ;G2    ;A3
        M2    ;             ;G2    ;A4

    It corresponds to the hierarchical data structure:

        M1
        ---G1
        ------A1
        ------A2
        ------G2
        ---------A3
        ---------A4
        ---------G3
        ------------A5
        ------G4
        ---------A7
        ---------A8
        M2
        ---G1
        ------A1
        ------A2
        ---G2
        ------A3
        ------A4

    Remarks: a message M can have any number of groups G and attributes A, and a group G can have any number of attributes and any number of sub-groups, each of them having sub-groups too.

    That being said, I'm trying to read this flat CSV description and store it in this structure of beans:

        Map<String, MBean> messages = new HashMap<String, MBean>();

        public class MBean {
            private String mes_id;
            private Map<String, GBean> groups;
        }

        public class GBean {
            private String grp_id;
            private Map<String, ABean> attributes;
            private Map<String, GBean> underGroups;
        }

        public class ABean {
            private String attr_id;
        }

    Reading the CSV file sequentially is OK, and I've been investigating how to use recursion to store the description data, but couldn't find a way. Thanks in advance for any of your algorithmic ideas. I hope it will put you in the mood of thinking about this... I have to admit that I'm out of ideas :s
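
    One way to avoid recursion entirely: since each row names its parent group, keep an index of groups per message and attach each group either to its parent or to the message root as the rows arrive. A sketch of that idea in Python, with plain dicts standing in for the beans (it assumes a parent group appears before its sub-groups, as in the sample above; the file name is made up):

        # Sketch: build message -> group -> sub-group/attribute nesting from flat rows.
        # Groups are indexed by (message id, group id) so a parent is found in O(1).
        import csv

        messages = {}            # mes_id -> {"groups": {...}}
        group_index = {}         # (mes_id, grp_id) -> group dict

        with open("description.csv", newline="") as f:
            reader = csv.reader(f, delimiter=";")
            next(reader)                                     # skip the header row
            for mes_id, parent_id, grp_id, attr_id in reader:
                mes_id, parent_id, grp_id, attr_id = (s.strip() for s in (mes_id, parent_id, grp_id, attr_id))
                message = messages.setdefault(mes_id, {"groups": {}})

                group = group_index.get((mes_id, grp_id))
                if group is None:
                    group = {"attributes": {}, "underGroups": {}}
                    group_index[(mes_id, grp_id)] = group
                    if parent_id:                            # attach under the parent group...
                        group_index[(mes_id, parent_id)]["underGroups"][grp_id] = group
                    else:                                    # ...or directly under the message
                        message["groups"][grp_id] = group

                group["attributes"][attr_id] = {"attr_id": attr_id}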

    Read the article

  • Import CSV to MySQL

    - by 404error
    I have created a database and table. I have also created all the fields I will be needing - 46 fields, including one that is the ID for the row. The CSV doesn't contain the ID field, nor does it contain the headers for the columns. I am new to all of this but have been trying to figure it out. I'm not on here being lazy asking for the answer, but looking for direction. I'm trying to figure out how to import the CSV but have it start importing data at the 2nd field, since I'm hoping the auto_increment will fill in the ID field, which is the first field I created. I tried these instructions with no luck. Can anyone offer some insight?

        1. Your CSV file's column names must match your table column names.
        2. Browse to your required .csv file.
        3. Select CSV using the LOAD DATA options.
        4. Check the box ON for "Replace table data with file".
        5. In the "Fields terminated by" box, type ,
        6. In the "Fields enclosed by" box, type "
        7. In the "Fields escaped by" box, type \
        8. In the "Lines terminated by" box, type auto
        9. In the "Column names" box, type the column names separated by , like column1,column2,column3
        10. Check the box ON for "Use LOCAL keyword".
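
    The usual trick for the ID column is to name the destination columns explicitly and leave the auto_increment ID out of the list, so MySQL fills it in. A hedged sketch of driving LOAD DATA from Python (mysql-connector-python and a server with local infile enabled are assumptions; the table and column names are made up, and the real statement would list all 45 data columns):

        # Sketch: import a headerless CSV, letting auto_increment fill the first column.
        import mysql.connector

        conn = mysql.connector.connect(host="localhost", user="user", password="secret",
                                       database="mydb", allow_local_infile=True)
        cursor = conn.cursor()

        # Listing the columns (and omitting `id`) is what makes MySQL skip the ID field.
        cursor.execute("""
            LOAD DATA LOCAL INFILE 'data.csv'
            INTO TABLE my_table
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\\n'
            (column2, column3, column4)   -- continue through all 45 data columns
        """)
        conn.commit()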

    Read the article

  • Python csv reader acting weird

    - by PylonsN00b
    So OK, if I run this wrong code:

        csvReader1 = csv.reader(file('new_categories.csv', "rU"), delimiter=',')
        for row1 in csvReader1:
            print row1[0]
            print row1[8]
            category_sku = str(row[8])
            if category_sku == sku:
                classifications["Craft"] = row[0]
                classifications["Theme"] = row[1]

    I get:

        Knitting
        391
        Traceback (most recent call last):
          File "upload_all_inventory_ebay.py", line 403, in <module>
            inventory_item_list = get_item_list(product)
          File "upload_all_inventory_ebay.py", line 294, in get_item_list
            category_sku = str(row[8])
        NameError: global name 'row' is not defined

    where Knitting and 391 are exactly right. Of course I need to refer to row[8] as row1[8]... so I do this:

        csvReader1 = csv.reader(file('new_categories.csv', "rU"), delimiter=',')
        for row1 in csvReader1:
            print row1[0]
            print row1[8]
            category_sku = str(row1[8])
            if category_sku == sku:
                classifications["Craft"] = row1[0]
                classifications["Theme"] = row1[1]

    And I get this:

        ...........
        Crochet
        107452
        Knitting
        107454
        Knitting
        107455
        Knitting
        107456
        Knitting
        107457
        Crochet
        108200
        Crochet
        108201
        Crochet
        108205
        Crochet
        108213
        Crochet
        108214
        Crochet
        108217
        108432
        Quilt
        108451
        108482
        108488
        Scrapbooking
        108711
        Knitting
        122363
        Needlework
        Beading
        Crafts & Decorating
        Crochet
        Crochet
        Crochet
        Traceback (most recent call last):
          File "upload_all_inventory_ebay.py", line 403, in <module>
            inventory_item_list = get_item_list(product)
          File "upload_all_inventory_ebay.py", line 292, in get_item_list
            print row1[0]
        IndexError: list index out of range

    where the output you see there is every effing thing in column 0 and column 1! Why? And WHY is row1[0] out of range if it wasn't before? YAY Fridays!
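
    For what it's worth, the IndexError usually means some rows in the file have fewer than nine columns (blank or short lines), so row1[8] - and for an empty row even row1[0] - stops existing. A defensive sketch in Python 3 that also reads the file only once, indexing it by SKU (sku and classifications are the question's variables; the sample value is made up):

        # Sketch: read the category CSV once, skip rows too short to have column 8,
        # and index by SKU so later lookups don't touch the loop variables at all.
        import csv

        sku_index = {}
        with open('new_categories.csv', newline='') as f:
            for row1 in csv.reader(f, delimiter=','):
                if len(row1) < 9:            # blank or short line - would raise IndexError
                    continue
                sku_index[row1[8].strip()] = {"Craft": row1[0], "Theme": row1[1]}

        # later, for a given product SKU:
        sku = "107452"                       # hypothetical value
        classifications = sku_index.get(sku, {})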

    Read the article

  • Loading a CSV file using jQuery GET returns the header but no data

    - by Cees Meijer
    When reading a CSV file from a server using the jQuery 'GET' function, I do not get any data. When I look at the code using Firebug, I can see the GET request is sent and the return value is '200 OK'. I also see that the header is returned correctly, so the request is definitely made and data is returned. This is also what I see in Wireshark: the complete contents of the CSV file are returned as a standard HTTP response. But the actual data is not there in my script. Firebug shows an empty response and the 'success' function is never called. What could be wrong?

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
            <title>New Web Project</title>
            <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
            <script src="jquery.js" type="text/javascript" charset="utf-8"></script>
            <script type="text/javascript">
                var csvData;
                $(document).ready(function() {
                    $("#btnGET").click(function() {
                        csvData = $.ajax({
                            type: "GET",
                            url: "http://www.mywebsite.com/data/sample_file.csv",
                            dataType: "text/csv",
                            success: function () {
                                alert("done!" + csvData.getAllResponseHeaders())
                            }
                        });
                    });
                })
            </script>
        </head>
        <body>
            <h1>New Web Project Page</h1>
            <button id="btnGET">GET Data</button>
        </body>
        </html>

    Read the article

  • PHP generating csv not sending correct new line feeds

    - by sjw
    I have a script that generates a CSV file using the following code:

        header('Content-type: text/csv');
        header('Content-Disposition: attachment; filename="'.date("Ymdhis").'.csv"');
        print $content;

    The $content variable simply contains lines with fields separated by commas, each finalised with ."\n"; to generate a new line. When I open the file as a CSV it looks fine; however, when I try to use the file to import into an external program (MYOB), it does not recognise the end-of-line (\n) character and assumes one long line of text. When I view the contents of the file in Notepad, the end-of-line character (\n) is a small rectangular box which looks like the character code 0x7F. If I open the file and re-save it in Excel, it removes this character and replaces it with a proper end-of-line character, and I can then import the file. What character do I need to generate in PHP so that Notepad recognises it as a valid end-of-line character? (\n) obviously doesn't do the job.

    Read the article

  • Stop writing blank line at the end of CSV file (using MATLAB)

    - by Grant M.
    Hello all... I'm using MATLAB to open a batch of CSV files containing column headers and data (using the 'importdata' function), then I manipulate the data a bit and write the headers and data to new CSV files using the 'dlmwrite' function. I'm using the '-append' and 'newline' attributes of 'dlmwrite' to add each line of text/data on a new line. Each of my new CSV files has a blank line at the end, whereas this blank line was not there when I read in the data... and I'm not using 'newline' on my final call of 'dlmwrite'. Does anyone know how I can keep from writing this blank line to the end of my CSV files? Thanks for your help, Grant

    EDITED 5/18/10 1:35PM CST - Added information about code and text file per request... you'll notice after performing the procedure below that there appears to be a carriage return at the end of the last line in the new text file. Consider a text file named 'textfile.txt' that looks like this:

        Column1, Column2, Column3, Column4, Column 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5
        1, 2, 3, 4, 5

    Here's a sample of the code I am using:

        % import data
        importedData = importdata('textfile.txt');

        % manipulate data
        importedData.data(:,1) = 100;

        % store column headers into single comma-delimited
        % character array (for easy writing later)
        columnHeaders = importedData.textdata{1};
        for counter = 2:size(importedData.textdata,2)
            columnHeaders = horzcat(columnHeaders,',',importedData.textdata{counter});
        end

        % write column headers to new file
        dlmwrite('textfile_updated.txt',columnHeaders,'Delimiter','','newline','pc')

        % append all but last line of data to new file
        for dataCounter = 1:(size(importedData.data,2)-1)
            dlmwrite('textfile_updated.txt',importedData.data(dataCounter,:),'Delimiter',',','newline','pc','-append')
        end

        % append last line of data to new file, not
        % creating new line at end
        dlmwrite('textfile_updated.txt',importedData.data(end,:),'Delimiter',',','-append')

    Read the article

  • append text to lines in a CSV file

    - by MichaelMcCabe
    This question seems to have been asked a million times around the web, but I cannot find an answer which will work for me. Basically, I have a CSV file which has a number of columns (say two). The program goes through each row in the CSV file, taking the first column value, then asks the user for the value to be placed in the second column. This is done on a handheld running Windows Mobile 6, and I am developing in C#. It seems a simple thing to do, but I can't seem to add text to a line. I can't use OleDb, as System.Data.OleDb isn't in the .NET version I am using. I could use another CSV file, so that when they complete each line it is written to the other file. But the problems with that are: the file that's produced at the end needs to contain EVERY line (so what if they pull the batteries out half way through?), and if they come back to continue this another time, how will the program know where to start from?
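
    The question is C# on Windows Mobile, but the resumable two-file pattern it hints at is worth sketching: append each completed row to the output file as it is answered, and on restart count the output's rows to know where to resume. The idea in Python (file names made up), purely to show the shape of it:

        # Sketch of the resumable two-file pattern: output.csv always contains every
        # completed row, and its row count says where to pick up after a restart.
        import csv
        import os

        def completed_rows(path):
            if not os.path.exists(path):
                return 0
            with open(path, newline="") as f:
                return sum(1 for _ in csv.reader(f))

        done = completed_rows("output.csv")

        with open("input.csv", newline="") as src, \
             open("output.csv", "a", newline="") as dst:
            writer = csv.writer(dst)
            for i, row in enumerate(csv.reader(src)):
                if i < done:                       # already answered in a previous session
                    continue
                answer = input(f"Value for {row[0]}: ")
                writer.writerow([row[0], answer])
                dst.flush()                        # survive a battery pull mid-run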

    Read the article

  • Speed up csv export when using php from mysql database query

    - by John
    OK, so I've got a web system (built on CodeIgniter and running on MySQL) that allows people to query a database of postal address data by making selections in a series of forms until they arrive at the selection they want - pretty standard stuff. They can then buy that information and download it via the system. The queries run very fast, but when it comes to applying a query to the database and exporting the result to CSV, once the datasets get to around the 30,000-record mark (each row has around 40 columns, of which about 20 are populated with on average 20 characters of data per cell), it can take 5 or so minutes to export to CSV. So, my question is: what is the main cause of the slowness? Is the result set from the query so large that it is running into memory issues, and should I therefore allow much more memory for the process? Or is there a much more efficient way of exporting to CSV from a MySQL query that I'm not using? Should I save the contents of the query to a temp table and simply export the temp table to CSV? Or am I going about this all wrong? Also, is the fact that I'm using CodeIgniter's Active Record for this prohibitive, due to the way that it stores the result set? Any advice is welcome! Thank you for reading!

    Read the article

  • Bash on Snow Leopard doesn't obey terminal colours

    - by karbassi
    With the new version of Snow Leopard, OS X upgraded bash to GNU bash, version 3.2.48(1)-release (x86_64-apple-darwin10.0). Now, my .bashrc sets the following:

        # Colors
        export TERM=xterm-color
        export GREP_OPTIONS='--color=auto' GREP_COLOR='1;32'
        export CLICOLOR=1
        export LSCOLORS=ExGxFxDxCxHxHxCbCeEbEb

        # Setup some colors to use later in interactive shell or scripts
        export COLOR_NC='\e[0m' # No Color
        export COLOR_WHITE='\e[1;37m'
        export COLOR_BLACK='\e[0;30m'
        export COLOR_BLUE='\e[0;34m'
        export COLOR_LIGHT_BLUE='\e[1;34m'
        export COLOR_GREEN='\e[0;32m'
        export COLOR_LIGHT_GREEN='\e[1;32m'
        export COLOR_CYAN='\e[0;36m'
        export COLOR_LIGHT_CYAN='\e[1;36m'
        export COLOR_RED='\e[0;31m'
        export COLOR_LIGHT_RED='\e[1;31m'
        export COLOR_PURPLE='\e[0;35m'
        export COLOR_LIGHT_PURPLE='\e[1;35m'
        export COLOR_BROWN='\e[0;33m'
        export COLOR_YELLOW='\e[1;33m'
        export COLOR_GRAY='\e[1;30m'
        export COLOR_LIGHT_GRAY='\e[0;37m'

    The colours are used later on for output. This used to work in previous versions of OS X, but now my output no longer obeys these colours (a screenshot illustrated this; it is not reproduced here). Some ideas that have not worked: switching Terminal.app from 64-bit to 32-bit.

    Read the article

  • Problems exporting JAVA_HOME and finding or creating .bashrc in Mac OS 10.6

    - by casiopea
    Hello, I need to install a program for my studies, and this program needs Java to run. When I try to perform the installation, it says it cannot find the JDK; since the JDK is already installed by default on the Mac, the problem is exporting the Java home. I've tried a lot and I can't do it! I know that I have to add a line to a .bashrc file (or .profile, or .bash_profile). I've created all those files, at different times, but nothing... I'm a new Mac user, but I use Linux too, and I don't know what happened. I just need to export the Java home to do my work... and it is really necessary for me to add environment variables too. Thanks for your help.

    Read the article
