Search Results



  • Update MySQL table using data from a text file through Java

    - by Karthi Karthi
    I have a text file with four lines; each line contains comma-separated values. My file is:

        Raj,[email protected],123455
        kumar,[email protected],23453
        shilpa,[email protected],765468
        suraj,[email protected],876567

    and I have a MySQL table which contains four fields:

        firstname   lastname   email              phno
        ----------  ---------- ------------------ --------
        Raj         babu       [email protected]    2343245
        kumar       selva      [email protected]    23453
        shilpa      murali     [email protected]    765468
        suraj       abd        [email protected]    876567

    Now I want to update my table using the data in the above text file through Java. I have tried using BufferedReader to read from the file, used the split method with a comma as delimiter and stored the pieces in an array, but it is not working. Any help appreciated. This is what I have tried so far:

        void readingFile() {
            try {
                File f1 = new File("TestFile.txt");
                FileReader fr = new FileReader(f1);
                BufferedReader br = new BufferedReader(fr);
                String strln = null;
                strln = br.readLine();
                while ((strln = br.readLine()) != null) {
                    // System.out.println(strln);
                    arr = strln.split(",");
                    strfirstname = arr[0];
                    strlastname = arr[1];
                    stremail = arr[2];
                    strphno = arr[3];
                    System.out.println(strfirstname + " " + strlastname + " " + stremail + " " + strphno);
                }
                // for(String i : arr)
                // {
                // }
                br.close();
                fr.close();
            } catch (IOException e) {
                System.out.println("Cannot read from File." + e);
            }
            try {
                st = conn.createStatement();
                String query = "update sampledb set email = stremail,phno =strphno where firstname = strfirstname ";
                st.executeUpdate(query);
                st.close();
                System.out.println("sampledb Table successfully updated.");
            } catch (Exception e3) {
                System.out.println("Unable to Update sampledb table. " + e3);
            }
        }

    and the output I got is:

        Ganesh Pandiyan [email protected] 9591982389
        Dass Jeyan [email protected] 9689523645
        Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 1
        Gowtham Selvan [email protected] 9894189423
            at TemporaryPackages.FileReadAndUpdateTable.readingFile(FileReadAndUpdateTable.java:35)
            at TemporaryPackages.FileReadAndUpdateTable.main(FileReadAndUpdateTable.java:72)
        Java Result: 1

    @varadaraj: This is the code of yours...

        String stremail, strphno, strfirstname, strlastname;
        // String[] arr;
        Connection conn;
        Statement st;

        void readingFile() {
            try {
                BufferedReader bReader = new BufferedReader(new FileReader("TestFile.txt"));
                String fileValues;
                while ((fileValues = bReader.readLine()) != null) {
                    String[] values = fileValues.split(",");
                    strfirstname = values[0];
                    // strlastname = values[1];
                    stremail = values[1];
                    strphno = values[2];
                    System.out.println(strfirstname + " " + strlastname + " " + stremail + " " + strphno);
                }
                bReader.close();
            } catch (IOException e) {
                System.out.println("File Read Error");
            }
            // for(String i : arr)
            // {
            // }
            try {
                st = conn.createStatement();
                String query = "update sampledb set email = stremail,phno =strphno where firstname = strfirstname ";
                st.executeUpdate(query);
                st.close();
                System.out.println("sampledb Table successfully updated.");
            } catch (Exception e3) {
                System.out.println("Unable to Update sampledb table. " + e3);
            }
        }
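
    A likely reason the table never changes is that the UPDATE string sends the words stremail, strphno and strfirstname to MySQL as literal text instead of the values read from the file; binding them through a PreparedStatement fixes that and handles quoting. A minimal sketch of the whole loop, assuming the table and column names used in the question's query, that each line holds firstname, email, phone, and that the MySQL Connector/J driver is on the classpath:

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;

        public class UpdateFromFile {
            public static void main(String[] args) throws IOException, SQLException {
                // connection details are placeholders; adjust for your server
                String sql = "UPDATE sampledb SET email = ?, phno = ? WHERE firstname = ?";
                try (Connection conn = DriverManager.getConnection(
                             "jdbc:mysql://localhost:3306/sampledb", "user", "password");
                     BufferedReader br = new BufferedReader(new FileReader("TestFile.txt"));
                     PreparedStatement ps = conn.prepareStatement(sql)) {
                    String line;
                    while ((line = br.readLine()) != null) {
                        String[] v = line.split(",");        // firstname, email, phno
                        if (v.length < 3) continue;          // skip blank/short lines
                        ps.setString(1, v[1].trim());        // email
                        ps.setString(2, v[2].trim());        // phno
                        ps.setString(3, v[0].trim());        // firstname
                        ps.executeUpdate();
                    }
                }
            }
        }

    The ArrayIndexOutOfBoundsException in the output usually just means one line had fewer commas than expected (for example a trailing blank line), which is why the sketch skips lines that do not split into three fields.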

    Read the article

  • Infinite sharing system (PHP/MySQLi)

    - by Toine Lille
    I'm working on a discount system for whichever customer shares a product and brings in new customers. Each unique visit = $0.05 off, each new customer = $0.50 off (it's a cheap product so yeah, no big numbers). When a new customer shares the site, the customer initially responsible for the new customer (if any) will get half of the new customer's discount as well. The initial customer would get a fourth for the next level and the new customer half of that, etc, creating a tree or pyramid that way that could be infinite. Initial customer ($1.35 discount: 2 new+3 visits + half of 1 new+2 visits) Visitor ($0) Visitor ($0) New customer ($0.60) Visitor ($0) Visitor ($0) Newer customer ($0) New customer ($0) Visitor ($0) The customers are saved along with their IP addresses (bin2hex(inet_pton)) in a database table (customers) with info like a unique id, e-mail address and first date/time the purchased a product (= time of registration). The shares are saved in a separate table within the same database (sharing). Each unique IP addresses that visits the site creates a new row featuring the IP address (also saved as bin2hex(inet_pton)), the id of the customer who shared it and the date/time of the visit. Sharing goes via URL, featuring a GET element containing the customer's id. Visits and new customers overlap, as visits will always occur before the new customer does. That's fine. The date/times are used just to make it a little more secure (I also use the IP along with cookies to see if people cheat the system). If an IP is already in the sharing or customer tables, it does not count and will not create a new entry. Now the problem is, how to make the infinity happen and apply the different values to it? That's all I'd need to know. It needs to calculate the discount for each customer separately, but also allow for monitoring altogether (though that's just a matter of passing all ID's through it). I figured I'd start (after the database connection) with $stmt = $con->prepare('SELECT ip,datetime FROM sharing WHERE sender=?'); $stmt->bind_param('i',$customerid); $stmt->execute(); $stmt->store_result(); $discount = $discount + ($stmt->num_rows * 0.05); $stmt->bind_result($ip,$timeofsharing); to translate all the visits to $0.05 of discount each. To check for the new customers that came from these visits, I wrote the following: while ($sql->fetch()) { $stmt2 = $con->prepare("SELECT datetime FROM users WHERE ip=?"); $stmt2->bind_param('s',$ip); $stmt2->execute(); $stmt2->store_result(); $stmt2->bind_result($timeofpurchase); Followed by a little more security comparing the datetimes: while ($stmt2->fetch()) { if (strtotime($timeofpurchase) < strtotime($timeofsharing)) { $discount = $discount + $0.50; } But this is just for the initial customer's direct results. If I'd want to check for the next level, I'd basically have to put the exact same check and loop in itself, checking each new customer the initial customer they brought to the site, and then for the next level again to check all of the newer customers, etc, etc. What to do? / Where to go? / What would be the correct practice for this? Thanks!
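
    One way to make the depth unlimited without writing one loop per level is recursion: a customer's discount is their own visits and direct conversions plus half of each directly referred customer's total discount, which halves again automatically at every deeper level. A rough sketch against the tables described above; it assumes the customer table is called customers with id and ip columns (the question's second snippet queries users, so adjust names), and the datetime sanity check can be added inside the loop:

        // $con is an open mysqli connection
        function customerDiscount(mysqli $con, $customerId) {
            $discount = 0.0;
            $visitorIps = array();

            // $0.05 for every unique visit this customer's shared link produced
            $stmt = $con->prepare('SELECT ip FROM sharing WHERE sender = ?');
            $stmt->bind_param('i', $customerId);
            $stmt->execute();
            $stmt->store_result();
            $stmt->bind_result($ip);
            while ($stmt->fetch()) {
                $visitorIps[] = $ip;
                $discount += 0.05;
            }
            $stmt->close();

            // $0.50 for each visitor who became a customer, plus half of that
            // customer's own discount, which recurses down the whole tree
            foreach ($visitorIps as $visitorIp) {
                $stmt = $con->prepare('SELECT id FROM customers WHERE ip = ?');
                $stmt->bind_param('s', $visitorIp);
                $stmt->execute();
                $stmt->store_result();
                $stmt->bind_result($childId);
                $found = $stmt->fetch();
                $stmt->close();
                if ($found) {
                    $discount += 0.50 + 0.5 * customerDiscount($con, $childId);
                }
            }
            return $discount;
        }

    For large trees you would eventually want to memoize results or walk the tree level by level with one query per level, but the recursive form mirrors the pyramid structure directly.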

    Read the article

  • What happened to Alan Cooper's Unified File Model?

    - by PAUL Mansour
    For a long time Alan Cooper (in the 3 versions of his book "About Face") has been promoting a "unified file model" to, among other things, dispense with what he calls the most idiotic message box ever invented - the one that pops up when you hit the close button on an app or form, saying "Do you want to discard your changes?" I like the idea and his arguments, but also have the knee-jerk reaction against it that most seasoned programmers and users have. While Cooper's book seems quite popular and respected, there is remarkably little discussion of this particular issue on the Web that I can find. Petter Hesselberg, the author of "Programming Industrial Strength Windows", mentions it, but that seems about it. I have an opportunity to implement this in the (desktop) project I am working on, but face resistance from customers and co-workers, who are of course familiar with the MS Word and Excel way of doing things. I'm in a position to override their objections, but am not sure if I should. My questions are: Are there any good discussions of this that I have failed to find? Is anyone doing this in their apps? Is it a good idea that is unfortunately not practical to implement until, say, Microsoft does it?

    Read the article

  • Error: couldn't read file - Kernel panic

    - by Thanos
    I have just installed Ubuntu 12.04.1. To be honest, I had to run the installation several times until it finished fine. When I finally managed to install it properly, I powered on the laptop and GRUB showed up. I selected Ubuntu generic. It takes some time to load, and when it does I get an error message stating:

        error: couldn't read file
        Press any key to continue

    If I press any button nothing happens. If I leave it there, in a short while a black screen loads which gives some weird messages:

        [0.946710] Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block (0,0)
        [0.946755] Pid: 1, comm: swapper/0 Not tainted 3.2.0-29-generic #46-Ubuntu
        [0.946792] Call Trace:
        [0.946831] [<ffffffff81640ec8>] panic+0x91/0x1a4
        [0.946869] [<ffffffff81cfc01e>] mount_block_root+0xdc/0x18e
        [0.946909] [<ffffffff81002930>] ? populate_rootfs_wait+0x300/0x9d0
        [0.946947] [<ffffffff81cfc257>] mount_root+0x54/0x59
        [0.946982] [<ffffffff81cfcec9>] prepare_namespace+0x16d/0x1a6
        [0.947019] [<ffffffff81cfbd63>] kernel_init+0x153/0x158
        [0.947094] [<ffffffff81cfbc10>] ? start_kernel+0x3bd/0x3bd
        [0.947129] [<ffffffff81664030>] ? gs_change+0x13/0x13

    The thing is that the laptop isn't mine. A friend tried to dual boot Ubuntu alongside Windows 7 but didn't succeed. The Ubuntu option was in GRUB, but when you tried to boot it, the machine rebooted from the start. So from a live CD I erased Ubuntu and started Windows to check if something went wrong; fortunately everything was OK and Windows started normally. So I tried to install Ubuntu. Before the installation was completed the installer crashed! I was afraid he had lost Windows, which turned out to be true... At that point I tried to install Windows, but whichever version I tried (XP; 7 Home, Professional, Ultimate; 8) it could never reach the end. So I tried to reinstall Ubuntu, but I keep facing those weird messages. What can I do to move on?

    EDIT 1: I tried to check and fix (if possible) with GParted. It took a lot of hours, although GParted displays only 01:14. I restarted the system and now I get almost the same messages; only the numbers in brackets [ ] are different:

        [0.818189] Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block (0,0)
        [0.818235] Pid: 1, comm: swapper/0 Not tainted 3.2.0-29-generic #46-Ubuntu
        [0.818272] Call Trace:
        [0.818312] [<ffffffff81640ec8>] panic+0x91/0x1a4
        [0.818351] [<ffffffff81cfc01e>] mount_block_root+0xdc/0x18e
        [0.818391] [<ffffffff81002930>] ? populate_rootfs_wait+0x300/0x9d0
        [0.818428] [<ffffffff81cfc257>] mount_root+0x54/0x59
        [0.818464] [<ffffffff81cfcec9>] prepare_namespace+0x16d/0x1a6
        [0.818501] [<ffffffff81cfbd63>] kernel_init+0x153/0x158
        [0.818574] [<ffffffff81cfbc10>] ? start_kernel+0x3bd/0x3bd
        [0.818610] [<ffffffff81664030>] ? gs_change+0x13/0x13

    What on earth is going on?

    EDIT 2: I forgot to mention that my friend gave his laptop a punch during a game. After that his cooler began to make a weird noise, so I checked and it is a bit bent, but it is working. What I believe must be wrong is that his HDD makes a weird noise while trying to load Ubuntu, which means he might need a new HDD. Could that be true?
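
    Regarding the question in EDIT 2: before buying a new drive, the disk's own SMART health data gives a quick verdict and can be read from the live CD. A possible check, assuming the disk is /dev/sda and smartmontools still needs installing:

        sudo apt-get install smartmontools
        sudo smartctl -H /dev/sda     # overall health verdict
        sudo smartctl -a /dev/sda     # full attributes; watch Reallocated_Sector_Ct and Current_Pending_Sector

    A knocked laptop, a noisy drive, installers that never finish and a kernel that cannot read its root filesystem are all consistent with a failing HDD, so back up whatever data is still readable first.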

    Read the article

  • How to speed up file transfer to/from Ubuntu Server 11.10 (wifi)

    - by Alexander
    I've been searching AU & elsewhere for the last day and a half. Haven't found an answer so I joined AU to ask for help. I'm hoping someone can point me in the right direction. Ubuntu Server 11.10 Samba VSFTPD Windows 7 PC 2 MacBook Pro - Snow Leopard/Lion 1 iMac - Lion Wireless LAN using DLink DIR-655 Link Speed: 195 Mbit/s on Mac - 54Mbps on Windows ISP Connection: Cable - 20 down/3 up No Domain Controller. All machines are members of the same workgroup. No matter how I connect I can't get better than about 700K transfer rate up/down. Mac/PC, SMB/ftp, Domain Name/Local IP I've tried different user accounts and using different folders, different volumes on the server. Nothing seems to make a difference. 700K up/down. Period. Any suggestions would be greatly appreciated. Thanks, Alexander EDIT: Using sftp now and uploading seems to peak at 980k. After about 5 minutes into a 650MB file, downloading is at 1072k and climbing about 500b/s every ten seconds. If any of that matters... I was expecting a lot faster than 1Mb tx rate. Am I off base here? EDIT: From all I've read so far, perhaps the speed isn't that bad. I only installed Ubuntu out of boredom this past weekend. The trouble is, I like it. Guess it's time to ditch the wifi and run some Cat 5.
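
    To narrow down whether the ~700 KB/s ceiling comes from the radio link or from Samba/VSFTPD, it helps to measure raw TCP throughput with something like iperf (a sketch; the Mac and Windows builds are separate downloads):

        # on the Ubuntu server
        sudo apt-get install iperf
        iperf -s

        # on a client, replacing <server-ip>
        iperf -c <server-ip> -t 30

    If iperf also tops out around 5-8 Mbit/s the problem is the wireless link itself (distance, interference, the DIR-655 negotiating low 802.11g rates); if it is much faster, look at Samba/FTP tuning instead, or as the edit suggests, run Cat 5.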

    Read the article

  • How to Recover that Photo, Picture or File You Deleted Accidentally

    - by The Geek
    Have you ever accidentally deleted a photo on your camera, computer, USB drive, or anywhere else? What you might not know is that you can usually restore those pictures—even from your camera’s memory stick. Windows tries to prevent you from making a big mistake by providing the Recycle Bin, where deleted files hang around for a while—but unfortunately it doesn’t work for external USB drives, USB flash drives, memory sticks, or mapped drives. Luckily there’s another way to recover deleted files. Note: we originally wrote this article a year ago, but we’ve received this question so many times from readers, friends, and families that we’ve polished it up and are republishing it for everybody. So far, everybody has reported success!

    Read the article

  • Error installing avogadro with CMake 'lconvert: could not exec No such file or directory'

    - by Orr22
    I'm brand new to Ubuntu. I'm trying to install Avogadro. The program needs the following packages, which I could install: CMake, OpenBabel 2.3.2, Qt4, Git, Eigen2. Here is the recipe to install it:

        cd $HOME/src
        git clone git://github.com/cryos/avogadro.git
        mkdir -p $HOME/build/avogadro
        cd $HOME/build/avogadro
        cmake $HOME/src/avogadro
        make -j2
        sudo make install

    It was unable to compile, but when I skipped the 'git clone' step it seemed to work just fine. After several stops during the CMake configuration process (software updates, get Doxygen, get flex, get bison) I was able to configure. But when I run the 'make -j2' command the installation stops as follows:

        Orr22@javi-87:~/build_avogadro$ make -j2
        [  0%] Built target elementcolor
        [  0%] Built target bsdyengine
        [  2%] Built target spglib
        [  3%] Built target navigatetool
        [  4%] Built target tubegen
        [  4%] Generating libavogadro_hu.qm
        [  6%] Built target OpenQube
        [  6%] Generating moc_animation.cxx
        lconvert: could not exec '/usr/lib/i386-linux-gnu/qt5/bin/lconvert': No such file or directory
        make[2]: *** [libavogadro/src/libavogadro_hu.qm] Error 1
        make[2]: *** Se espera a que terminen otras tareas....
        make[1]: *** [libavogadro/src/CMakeFiles/avogadro.dir/all] Error 2
        make: *** [all] Error 2

    Any suggestions on how to proceed? Thanks in advance, Orr22
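
    The failure is in the translation step: CMake is invoking lconvert at a Qt 5 path that does not exist on this machine. A possible way out is to install a package that actually ships lconvert and then re-run cmake from a clean build directory so the new path is picked up; the package name below is an assumption, so check what apt-file reports first:

        # find which package ships lconvert (apt-file needs: sudo apt-get install apt-file && sudo apt-file update)
        apt-file search bin/lconvert
        sudo apt-get install qttools5-dev-tools   # guessed name; use whatever apt-file reported
        # then rebuild from a clean directory so CMake re-detects the tool
        rm -rf $HOME/build/avogadro && mkdir -p $HOME/build/avogadro
        cd $HOME/build/avogadro && cmake $HOME/src/avogadro && make -j2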

    Read the article

  • Restore audio settings - cannot open mixer: No such file or directory

    - by Alfred M.
    The internal speaker of my laptop never functioned under Ubuntu. I tried to follow instructions on the web and now the jack output does not work either. The graphical interface for audio management now displays a 'dummy output' instead of the three possible outputs I used to have (one of them was working for the jack output). In a terminal, alsamixer raises an error:

        cannot open mixer: No such file or directory

    I did try to remove and reinstall alsa-utils but it did not change anything. This happened after a failed attempt to install alsa-driver-linuxant_1.0.23.1_all.deb from here. My sound card seems to be not recognised anymore. After reboot I no longer have the sound icon in the menu bar in the upper right corner. I think I have removed my sound card driver. Indeed, the command sudo lshw -class multimedia showed the audio device as unclaimed. Any idea how I could revert to a better situation (that is, jack support and ALSA working)?

    EDIT: The command lspci -nnk | grep -iEA3 audio gives

        00:1b.0 Audio device [0403]: Intel Corporation 82801I (ICH9 Family) HD Audio Controller [8086:293e] (rev 03)
                Subsystem: ASUSTeK Computer Inc. Device [1043:1893]
        00:1c.0 PCI bridge [0604]: Intel Corporation 82801I (ICH9 Family) PCI Express Port 1 [8086:2940] (rev 03)
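
    Since the breakage started with the failed alsa-driver-linuxant install, a reasonable first step is to purge that package and reinstall the stock ALSA packages. The package name below is inferred from the .deb filename, so confirm it with dpkg first:

        dpkg -l | grep -i linuxant                      # confirm the exact package name
        sudo dpkg --purge alsa-driver-linuxant          # remove the third-party driver
        sudo apt-get install --reinstall linux-sound-base alsa-base alsa-utils
        sudo alsa force-reload                          # or simply reboot
        aplay -l                                        # should list the HDA Intel (ICH9) card again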

    Read the article

  • Unable to boot Windows after installing Ubuntu 12.04 - error: invalid efi file path

    - by user113350
    I have a Laptop (ASUS X310A, I installed Ubuntu 12.04 to be side by side with Windows 7 but I seem to have gotten a problem with booting Windows 7. I used the Boot Repair twice with no results. Boot-Repair info: http://paste.ubuntu.com/1417623/ The error I get when starting Windows 7 from GRUB is: error: invalid efi file path In Boot Manager or Menu, I have 3 options now: 2x for Ubuntu (maybe cause I did boot-repair twice) 1x Windows boot manager (If I boot this it opens "ASUS Preload Wizard", it gives me the option to re-install windows losing all previous data -) When I was making the partition before installing Ubuntu, I made the new partition by making sda4 smaller and adding ext4 mounted: "\" and adding a swap area. Installed it and it didn't work, nothing worked. So i booted Ubuntu from the USB again and deleted the partitions I made and decided to make sda3 smaller and making the partitions but this time it gave me the option that I could mount sda3 on "\windows" or "\dos" I ignored it and didn't choose neither because the I know that it doesn't need to be mounted and proceeded to create what is now sda7 (ext4) and sda8 (swap area). It still didn't work so I booted from USB and did the first boot-repair, so I was able to boot Ubuntu now but not windows, but when I did it through my USB I was not able to update boot-repair, so i decided to redo the boot-repair from Ubuntu running on the Hardisk (fully updated) and it still didn't work. In GRUB this is what i see (when booting using Ubuntu as first option in Boot Menu): Ubuntu, with Linux 3.2.0-29-generic Ubuntu, with Linux 3.2.0-29-generic (recovery mode) Windows UEFI loader Windows Boot UEFI bootx64.efi.bkp Windows 7 (loader) (on /dev/sda3) Windows Recovery Environment (loader) (on /dev/sda5) I tried all the ones starting with "Windows" they all don't work Please help, Many Thanks

    Read the article

  • File permission issues after setting up an amazon ec2 instance

    - by Pardoner
    I've set up an Amazon EC2 instance and I'm having some file permission issues. I've created myself a new user and added myself to the following groups:

        adm:x:4:me,ubuntu
        www-data:x:33:me,www-data
        ssh:x:108:me
        admin:x:111:me
        ubuntu:x:1000:www-data,me
        me:x:1001:me

    but when I cd /var/www I can't do simple commands without doing sudo first. So I chmod -R www-data:www-data /var/www to ensure that I'm in the owning group, but I still have to type sudo for everything. If I sudo su www-data it works fine. Since I'm in the www-data group, shouldn't I have the same privileges as www-data? One strange thing I'm noticing is that when I ls -l it lists the owner but not the group names. Could this possibly be part of the issue? Is it possible for a directory to not be part of a group?

        drwxr-xr-x  4 www-data 4.0K Oct 24 16:39 .
        drwxr-xr-x 14 root     4.0K Oct 10 16:58 ..
        drwxrwxr-x  9 www-data 4.0K Oct 23 04:03 admin.mywebsite.com
        drwxrwxr-x  2 www-data 4.0K Oct  4 00:29 mywebsite.com
        drwxrwxr-x  9 www-data 4.0K Oct 23 04:03 staging.mywebsite.com
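
    One common gotcha here: group membership changes only apply to sessions started after the change, so an SSH session that was already open keeps the old group list and everything still needs sudo. A few things worth checking (usernames as in the question):

        id            # groups of the *current* session
        id me         # groups as stored in /etc/group
        # if www-data is missing from the first but present in the second,
        # log out and back in, or start a shell with the new group:
        newgrp www-data
        # and make sure the directories are actually owned and group-writable as intended:
        sudo chown -R www-data:www-data /var/www
        sudo chmod -R g+w /var/www

    As for ls -l hiding the group column: every file always belongs to a group; output like the listing above is what ls -lo (or an aliased ls) prints, so a plain ls -l /var/www should show it.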

    Read the article

  • JSCompress fails to compress my js file - why?

    - by Renso
    Issue: you use the online compression utility jscompress.com to compress your js file, but it fails with an error. Why this may be happening and how to fix it.
    Possible causes: apparently not using opening and closing curly brackets in an IF statement would cause this. Well, it turns out this is not the case. Look at the following example and see if you can figure out what the issue is :-)

        function SetupDeliveredVPRecontactNotes($item, id) {
            var theData;
            $.ajax({
                data: { deliveredVPId: id },
                url: $('#ajaxGetDeliveredVPRecontactNotesUrl').val(),
                type: "GET",
                async: false,
                dataType: "html",
                success: function(data, result) {
                    $item.empty();
                    var input = '<textarea class="recontactNote" rows="4" name="DeliveredVPRecontactNotes_' + id + '" id="DeliveredVPRecontactNotes_' + id + '" cols="115">' + data + '</textarea>';
                    $item.append(input);
                    theData = data;
                },
                error: function(XMLHttpRequest, textStatus, errorThrown) {
                    $item.empty();
                    alert("An error occurred: The operation to retrieve the DeliveredVP's Recontact Notes has failed");
                }
            });  //ajax
            return theData;
        }

    Solution: the name of the method/function is the same as the message in the ALERT message when the spaces are removed: "DeliveredVP Recontact Notes" becomes "DeliveredVPRecontactNotes" and matches that of the function. So I changed it to "DeliveredVP's Recontact Notes".

    Read the article

  • File doesn't exist when trying to change permissions following the avasys image scan manual

    - by Howard Graham
    I was finally able to connect to avasys.jp and downloaded and installed iscan_2.28.1-3.ltdl7_amd64.deb and iscan-data_1.13.0-1_all.deb. The programs appeared to install correctly. I then ran sane-find-scanner and got back:

        found USB scanner (vendor=0x04b8, product=0x012d) at libusb:001:003

    I then ran lsusb and got back:

        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 001 Device 003: ID 04b8:012d Seiko Epson Corp. Perfection V10/V100 (GT-S600/F650)
        Bus 001 Device 004: ID 03f0:4817 Hewlett-Packard
        Bus 002 Device 002: ID 093a:2510 Pixart Imaging, Inc. Optical Mouse

    The Avasys Image Scan manual instructed me to run chmod 0666 /proc/bus/usb/001/003, which returned:

        chmod: cannot access `/proc/bus/usb/001/003': No such file or directory

    In 12.04 no such directory exists; 12.04 appears to deal with USB in another way. What must I do to get the USB port 001/003 recognized by xsane and sane as the port where the scanner can be located? What must I do to continue installing the scanner?
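
    The manual predates the removal of usbfs: modern kernels no longer populate /proc/bus/usb, and the same bus/device numbers now appear under /dev/bus/usb. A possible equivalent of the manual's step, plus a persistent udev rule so it survives replugging (the rule file name is arbitrary):

        # one-off; note the bus/device numbers change when the scanner is replugged
        sudo chmod 0666 /dev/bus/usb/001/003

        # persistent alternative: create /etc/udev/rules.d/79-epson-scanner.rules containing
        #   SUBSYSTEM=="usb", ATTRS{idVendor}=="04b8", ATTRS{idProduct}=="012d", MODE="0666"
        sudo udevadm control --reload-rules
        # then unplug and replug the scanner and retry scanimage/xsane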

    Read the article

  • Understanding IDAT chunk of PNG file format

    - by DRapp
    From the sample image below, I have a border in yellow just for display purposes only. The actual .png file is a simple black/white image, 3 pixels by 3 pixels. I was originally thinking to try it as a 2x2, but that would not help when trying to interpret a low/high versus high/low drawing stream. At least this way I would have two black, one white from the top, or one white, two black from the bottom. So I read the chunks of data, get to the IDAT chunk, decode that (zlib) and come up with 12 bytes as follows:

        00 20 00 40 00 80

    So, my question: how does the above get broken down into the 3x3 black and white sample? Also, it is saved in palette format and properly recognizes the bit depth of 1 and color palette of 2. Palette[0] is RGBA all zeros; palette[1] has RGBA of 255, 255, 255, 0. I'll eventually get into the multiple other depth formats later, I just wanted to start with what I expect to be the easiest.
    Part II. Any guidance on handling the other depth formats would help, if there is anything special to be considered, especially regarding the alpha channel (which I am already looking for in the palette), that might trip me up.
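
    For a palette image, each decompressed scanline starts with one filter-type byte and is followed by the pixel bits packed most-significant-bit first, so with bit depth 1 a 3-pixel row fits in a single data byte. A small sketch of how the six byte values listed above unpack, assuming filter type 0 (which is what the 00 bytes are):

        public class IdatRows {
            public static void main(String[] args) {
                // decompressed IDAT bytes from the question: filter byte + one data byte per row
                int[] raw = {0x00, 0x20, 0x00, 0x40, 0x00, 0x80};
                int width = 3, height = 3, bitDepth = 1;
                int bytesPerRow = (width * bitDepth + 7) / 8;   // 1 data byte per row here

                for (int row = 0; row < height; row++) {
                    int offset = row * (1 + bytesPerRow);
                    int filter = raw[offset];                   // 0 = None; other types need unfiltering first
                    StringBuilder pixels = new StringBuilder();
                    for (int x = 0; x < width; x++) {
                        int b = raw[offset + 1 + x / 8];
                        int paletteIndex = (b >> (7 - (x % 8))) & 1;   // MSB-first packing
                        pixels.append(paletteIndex).append(' ');
                    }
                    System.out.println("row " + row + " filter=" + filter + "  indices: " + pixels);
                }
            }
        }

    That yields palette indices 0 0 1 / 0 1 0 / 1 0 0, one palette-entry-1 pixel per row. For Part II: bit depths 2 and 4 use the same MSB-first packing with wider groups, filter types 1-4 must be undone against neighbouring bytes and the previous row before the bits are extracted, and for indexed images the alpha comes from the tRNS chunk paired with the palette rather than from the pixel data itself.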

    Read the article

  • configuration issue with respect to .htaccess file on ubuntu

    - by Registered User
    I am building an application tshirtshop I have following configuration in /etc/apache2/sites-enabled/tshirtshop <VirtualHost *:80> ServerAdmin webmaster@localhost DocumentRoot /var/www/tshirtshop <Directory /var/www/tshirtshop> Options Indexes FollowSymLinks AllowOverride All Order allow,deny allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/access.log combined </VirtualHost> and following in .htaccess file in location /var/www/tshirtshop/.htaccess <IfModule mod_rewrite.c> # Enable mod_rewrite RewriteEngine On # Specify the folder in which the application resides. # Use / if the application is in the root. RewriteBase /tshirtshop #RewriteBase / # Rewrite to correct domain to avoid canonicalization problems # RewriteCond %{HTTP_HOST} !^www\.example\.com # RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L] # Rewrite URLs ending in /index.php or /index.html to / RewriteCond %{THE_REQUEST} ^GET\ .*/index\.(php|html?)\ HTTP RewriteRule ^(.*)index\.(php|html?)$ $1 [R=301,L] # Rewrite category pages RewriteRule ^.*-d([0-9]+)/.*-c([0-9]+)/page-([0-9]+)/?$ index.php?DepartmentId=$1&CategoryId=$2&Page=$3 [L] RewriteRule ^.*-d([0-9]+)/.*-c([0-9]+)/?$ index.php?DepartmentId=$1&CategoryId=$2 [L] # Rewrite department pages RewriteRule ^.*-d([0-9]+)/page-([0-9]+)/?$ index.php?DepartmentId=$1&Page=$2 [L] RewriteRule ^.*-d([0-9]+)/?$ index.php?DepartmentId=$1 [L] # Rewrite subpages of the home page RewriteRule ^page-([0-9]+)/?$ index.php?Page=$1 [L] # Rewrite product details pages RewriteRule ^.*-p([0-9]+)/?$ index.php?ProductId=$1 [L] </IfModule> the site is working on localhost and is working as if there is no .htaccess rule specified i.e. if I were to view a page as http://localhost/tshirtshop/nature-d2 then I get a 404 Error but if I view the same page as http://localhost/tshirtshop/index.php?DepartmentId=2 then I can view it. sudo apache2ctl -M Loaded Modules: core_module (static) log_config_module (static) logio_module (static) mpm_prefork_module (static) http_module (static) so_module (static) alias_module (shared) auth_basic_module (shared) authn_file_module (shared) authz_default_module (shared) authz_groupfile_module (shared) authz_host_module (shared) authz_user_module (shared) autoindex_module (shared) cgi_module (shared) deflate_module (shared) dir_module (shared) env_module (shared) mime_module (shared) negotiation_module (shared) php5_module (shared) reqtimeout_module (shared) rewrite_module (shared) setenvif_module (shared) status_module (shared) Syntax OK What is the mistake if any one can point out in above configuration, or else I need to check any thing else?
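
    A quick way to tell whether the .htaccess file is being consulted at all is to make it deliberately invalid: if AllowOverride All is really in effect you get a 500 error, while a normal page means the file is being ignored (wrong directory, a changed AccessFileName, or the override change never reloaded). A sketch:

        echo "ThisIsNotADirective" | sudo tee -a /var/www/tshirtshop/.htaccess
        curl -I http://localhost/tshirtshop/
        # 500 Internal Server Error -> .htaccess is read; the problem is inside the rewrite rules
        # 200 OK                    -> .htaccess is ignored; re-check AllowOverride and reload Apache
        sudo sed -i '$ d' /var/www/tshirtshop/.htaccess   # remove the test line again
        sudo service apache2 reload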

    Read the article

  • how to create java zip archives with a max file size limit [closed]

    - by Marci Casvan
    I need to write an algorithm in java (for an android app) to read a folder containing more folders and each of those containing images and audio files so the structure is this: mainDir/subfolders/myFile1.jpg It must be in java, something like perl script is not an option. It would preferably be for the compressed archive in order to squeeze as many files as possible before mailing the zip. Just a normal zip (no jar). My problem is that I need to limit the size of the archive to 16mb and at runtime, create as many archives as needed to contain all my files from my main mainDir folder. I tried several examples from the net, I read the java documentation, but I can't manage to understand and put it all together the way I need it. Has someone done this before or has a link or an example for me? I resolved the reading of the files with a recursive method but I can't write the logic for the zip creation I'm open for suggestions or better, a working example. EDIT: FileNotFoundException (no such file or directory) this was my initial post at Stack Overflow. I've got an answer to it, but I can't set the size of the ZipEntry and the logic doesn't work and also when extracting the my files from the zip I get the compression method not supported error.
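
    One workable pattern is to wrap each part's FileOutputStream in a small counting stream and start a new ZipOutputStream whenever the next file could push the running total past the limit; that sidesteps not knowing an entry's compressed size in advance. A sketch, with the 16 MB cap and folder layout taken from the question; the check is conservative because it uses the next file's uncompressed length, and a single file bigger than the cap still gets a part of its own:

        import java.io.*;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.zip.ZipEntry;
        import java.util.zip.ZipOutputStream;

        public class SplitZipper {
            static final long MAX_BYTES = 16L * 1024 * 1024;   // leave some headroom for the central directory

            static class CountingStream extends FilterOutputStream {
                long written;
                CountingStream(OutputStream out) { super(out); }
                @Override public void write(int b) throws IOException { out.write(b); written++; }
                @Override public void write(byte[] b, int off, int len) throws IOException {
                    out.write(b, off, len); written += len;
                }
            }

            public static void main(String[] args) throws IOException {
                File mainDir = new File("mainDir");
                List<File> files = new ArrayList<File>();
                collect(mainDir, files);

                int part = 1;
                CountingStream counter = null;
                ZipOutputStream zip = null;
                byte[] buf = new byte[8192];
                for (File f : files) {
                    // roll over to a new part if this file might not fit in the current one
                    if (zip == null || counter.written + f.length() > MAX_BYTES) {
                        if (zip != null) zip.close();
                        counter = new CountingStream(new FileOutputStream("part" + part++ + ".zip"));
                        zip = new ZipOutputStream(counter);
                    }
                    String entryName = f.getPath().substring(mainDir.getPath().length() + 1)
                                        .replace(File.separatorChar, '/');
                    zip.putNextEntry(new ZipEntry(entryName));
                    FileInputStream in = new FileInputStream(f);
                    int n;
                    while ((n = in.read(buf)) > 0) zip.write(buf, 0, n);
                    in.close();
                    zip.closeEntry();
                }
                if (zip != null) zip.close();
            }

            static void collect(File dir, List<File> out) {
                File[] children = dir.listFiles();
                if (children == null) return;
                for (File c : children) {
                    if (c.isDirectory()) collect(c, out);
                    else out.add(c);
                }
            }
        }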

    Read the article

  • Adding a user to samba

    - by JustMaximumPower
    I'm trying to setup some samba shares in my home network on an Ubuntu 12.04 machine. Everything works fine for my user account (max) but I can not add any new user. Every time I try to add new user they can not use the shares. It's likely that the error is very basic to the concept of samba but please don't just tell me to read the docs. I've been trying that for about 2 weeks now. I've set up the server with my user max who can mount transfer and the share max. Than I added the user simon with sudo adduser --no-create-home --disabled-login --shell /bin/false simon because the user should not be able to ssh into the machine. I did an sudo smbpasswd -a simon and set an (samba) password for simon and added an share for simon. I also added simon to transferusers to give him access to the share transfer. But simon can't connect to transfer or simons. ---- output of testparam: ------- Load smb config files from /etc/samba/smb.conf rlimit_max: increasing rlimit_max (1024) to minimum Windows limit (16384) Processing section "[printers]" Processing section "[print$]" Processing section "[max]" Processing section "[simons]" Processing section "[transfer]" Loaded services file OK. Server role: ROLE_STANDALONE Press enter to see a dump of your service definitions [global] server string = %h server (Samba, Ubuntu) map to guest = Bad User obey pam restrictions = Yes pam password change = Yes passwd program = /usr/bin/passwd %u passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* . unix password sync = Yes syslog = 0 log file = /var/log/samba/log.%m max log size = 1000 dns proxy = No usershare allow guests = Yes panic action = /usr/share/samba/panic-action %d idmap config * : backend = tdb [printers] comment = All Printers path = /var/spool/samba create mask = 0700 printable = Yes print ok = Yes browseable = No [print$] comment = Printer Drivers path = /var/lib/samba/printers [max] comment = Privater share von Max path = /media/Main/max read only = No create mask = 0700 [simons] comment = Privater share von Simon path = /media/Main/simon read only = No create mask = 0700 [transfer] comment = Transferlaufwerk path = /media/Main/transfer read only = No create mask = 0755 ---- The files in /media/Main: ------ drwxrwxr-x 17 max max 4096 Oct 4 19:13 max/ drwx------ 5 simon max 4096 Aug 4 15:18 simon/ drwxrwxr-x 7 max transferusers 258048 Oct 1 22:55 transfer/
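
    Before changing smb.conf it is worth confirming that the server really sees simon in transferusers and in Samba's own password database; the disabled shell is not the problem, since Samba never starts a login shell. Some checks to run on the server (a sketch, adjust names as needed):

        id simon                                   # should list transferusers
        sudo usermod -aG transferusers simon       # add him if it doesn't, then reconnect from the client
        sudo pdbedit -L                            # simon must appear in Samba's user database
        smbclient //localhost/transfer -U simon    # test the share locally with his samba password
        tail -f /var/log/samba/log.*               # watch for the actual deny reason while connecting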

    Read the article

  • How to add an SSH user to my Ubuntu 12 server to upload PHP files

    - by user229209
    I have an Ubuntu 12 VPS and wanted to create a user account to upload and download my PHP code. So when logged in as root I created a user "chris" and then created a directory /var/www/chris I want "chris" to be able to upload and run files to the /var/www/chris directory. Permissions for the chris dir look like this: drwxrwxr-x 2 root chris 4096 Aug 20 03:35 chris As root I created a sample file called abc.php and put it in the chris dir. It worked fine when I test it in a browser. I logged in as chris and uploaded a file called 1234.php. That did not work. I just got a blank PHP page. The code was identical in both files. So it is not the code. The permissions now look like this: -rw-r--r-- 1 root chris 59 Aug 20 03:34 1234.php -rw-r--r-- 1 root root 49 Aug 20 03:21 abc.php How do I alow the "chris" user to upload files and get them to work?
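
    If the goal is just to let chris upload into /var/www/chris, the usual arrangement is to give chris ownership of that directory and leave the web server with read access; PHP files do not need to be owned by root to run. A possible setup, assuming Apache runs as www-data:

        sudo chown -R chris:www-data /var/www/chris
        sudo chmod 2775 /var/www/chris          # setgid keeps new files in the www-data group
        sudo chmod 644 /var/www/chris/*.php     # world/group readable is enough for Apache

    For the blank page itself, compare the two files byte for byte (diff abc.php 1234.php); an FTP transfer in the wrong mode or an added BOM can make otherwise identical PHP output nothing.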

    Read the article

  • Ubuntu rm not deleting files

    - by ILMV
    My colleague and I have been struggling with deleting a directory and its contents. We are working on a new version of our websites source code on Ubuntu 8.04 (dir: /var/www/websites), what we want to do is delete the websites directory and recreate it from a .tar backup we created a couple weeks ago. The purpose of this is so we can run our deployment procedure in a local environment before we do so on our live / public environment. We use this command: rm -r websites This deletes the directory and the files within it. The problem occurs when we un-tar our backup file and view the website we are getting files that don't exist in the .tar backup, in fact these files were only created a few days ago and should have been deleted. We delete the directory once more in the manner stated above, we then create a new websites directory using the mkdir command. Strangely at this stage the 'deleted files' do not come back, but if we unpack our .tar file the 'deleted files' appear again. Is there a way to ensure these files are deleted, or at least the pointers that associate them with said directory. Our .tar backup does not include these files We do not want to use the shred command We do not want to use 3rd party applications Solution should be functional via terminal (SSH) Many thanks! EDIT Er... we fixed it. Turns out the files that are reappearing are because of a link we have to another directory (outside the /var/www/websites), we were restoring the link but not deleting the files on the other end. D'oh! Many thanks for your help guys... friday afternoon syndrome :-)

    Read the article

  • How to transfer files via infrared on Linux?

    - by arielnmz
    I know this is a way too old technology but I've got some files inside a very old cellphone that I need to transfer to a very old computer. So far my Infrared USB device works well, it's detected by the machine (lsusb output): Bus 002 Device 002: ID 0df7:0620 Mobile Action Technology, Inc. MA-620 Infrared Adapter I've tried to send the file over MMS and even email (it lacks bluetooth, not to mention USB). But this cellphones's firmware doesn't let me attach the files. The file was originally transfered via IrDA, and it only has an internal memory (a whole 2 million bytes! whoa!). I found a package called irda-utils, but it seems that there are only two executables: irdaping and irdadump. I think the dump utility might do the job (which as far as I can see it's kind of a version of tcpdump but for IrDA), but I don't even know how to process the received frames. Could this question may be what I'm looking for? EDIT While reading through the Linux Infrared HOWTO I found about the OpenObex project, which may be what I'm looking for...

    Read the article

  • Why doesn't my symbolic link work?

    - by orokusaki
    I'm trying to better understand symbolic links... and not having very much luck. This is my actual shell output with username/host changed:

        username@host:~$ mkdir actual
        username@host:~$ mkdir proper
        username@host:~$ touch actual/file-1.txt
        username@host:~$ echo "file 1" > actual/file-1.txt
        username@host:~$ touch actual/file-2.txt
        username@host:~$ echo "file 2" > actual/file-2.txt
        username@host:~$ ln -s actual/file-1.txt actual/file-2.txt proper
        username@host:~$ # Now, try to use the files through their links
        username@host:~$ cat proper/file-1.txt
        cat: proper/file-1.txt: No such file or directory
        username@host:~$ cat proper/file-2.txt
        cat: proper/file-2.txt: No such file or directory
        username@host:~$ # Check that actual files do in fact exist
        username@host:~$ cat actual/file-1.txt
        file 1
        username@host:~$ cat actual/file-2.txt
        file 2
        username@host:~$ # Remove the links and go home :(
        username@host:~$ rm proper/file-1.txt
        username@host:~$ rm proper/file-2.txt

    I thought that a symbolic link was supposed to operate transparently, in the sense that you could operate on the file that it points to as if you were accessing the file directly (except of course in the case of rm, where the link itself is simply removed).
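
    The links are created, but a symlink stores its target string literally: ln -s actual/file-1.txt ... proper makes links inside proper/ whose targets are resolved relative to proper/, where no actual/ directory exists, so they dangle. Rewriting the same session with targets expressed relative to the link's own directory (or as absolute paths) behaves as expected:

        ln -s ../actual/file-1.txt ../actual/file-2.txt proper/
        # or, with absolute targets:
        # ln -s "$PWD/actual/file-1.txt" "$PWD/actual/file-2.txt" proper/
        ls -l proper            # shows where each link points
        cat proper/file-1.txt   # now prints "file 1"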

    Read the article

  • Ubuntu: Getting rid of a mimetype entry

    - by Epaga
    I have a pesky mimetype entry that I can't seem to get rid of. Here is the current situation:

        xdg-mime query filetype myfile.mfe
        application/pesky

    Using assogiate I have found out the information about this MIME type entry (but can't delete it there). I have the following 'pesky.xml' XML file which was used to create the MIME type (as far as I can tell, since it exactly matches the entry in assogiate):

        <?xml version='1.0'?>
        <mime-info xmlns='http://www.freedesktop.org/standard'>
          <mime-type type="application/pesky">
            <comment>my pesky type</comment>
            <glob pattern="*.mfe"/>
            <magic priority="100">
              <match type="string" offset="0" value="application/pesky"/>
            </magic>
          </mime-type>
        </mime-info>

    However, the following has no effect:

        sudo xdg-mime uninstall --mode system --novendor pesky.xml

    The file association remains. Any ideas?
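
    xdg-mime uninstall only works when it can find the copy of the XML it originally installed; when it cannot, removing that copy by hand and regenerating the MIME database is the usual fallback. Roughly (locate the file first, since the installed name may carry a vendor prefix):

        grep -rl 'application/pesky' /usr/share/mime/packages ~/.local/share/mime/packages 2>/dev/null
        sudo rm /usr/share/mime/packages/<whatever-the-grep-found>.xml
        sudo update-mime-database /usr/share/mime
        update-mime-database ~/.local/share/mime       # only if the hit was under your home directory
        xdg-mime query filetype myfile.mfe             # should no longer report application/pesky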

    Read the article

  • download management

    - by Jonathan
    I download many files, usually 2 or 3 a day, often 10ish. Some of them are duplicates because I just can't be bothered to find the original in my downloads folder. I have previously tried DAP and used that to create a new subfolder for each day's download. yet I have found this insufficient as sometimes I wish to find files by name/file type or I have multiple parts of downloads over more than one day. Another problem I have found is zips/rars/etc after downloading them and extracting them I then have the zip and the folder. I like it like on a Mac where it automatically extracts the zip after it has been downloaded and removes the zip. What I'd like to be able to do is sort the downloads by date, but dynamically so they are just in the big downloads folder, but I can just press a button and it will show me all the files from a particular site, or from a particular day or by a certain file type. Is there any software that will do this? I use Chrome as a browser but also have Firefox and like that. Jonathan

    Read the article

  • 403 Forbidden error on Mac OSX - Apache and nginx

    - by tlianza
    Hi All, There are a million questions like this on Google, but I haven't found a solution to my problem. The default Apache install on my Mac is giving 403 Forbidden errors for everything (default directory, user home directory, virtual server, etc). After sifting through the config files, I figured I'd give nginx a try. Nginx serves files fine from its home directory, but it won't serve files from a subfolder of my user directory. I've configured a simple virtual host, and requesting index.html returns a 403 Forbidden. The error message in nginx's log file is pretty clear - it can't read the file:

        2011/01/04 16:13:54 [error] 96440#0: *11 open() "/Users/me/Documents/workspace/mobile/index.html" failed (13: Permission denied), client: 127.0.0.1, server: local.test.com, request: "GET /index.html HTTP/1.1", host: "local.test.com"

    I've opened up this directory to everyone:

        drwxrwxrwx   6 me  admin   204B Dec 31 20:49 mobile

    And all the files in it:

        $ ls -lah mobile/
        total 24
        drwxrwxrwx   6 me  admin   204B Dec 31 20:49 .
        drwxr-xr-x  71 me  me      2.4K Dec 31 20:41 ..
        -rw-r--r--@  1 me  me      6.0K Jan  2 18:58 .DS_Store
        -rwxrwxrwx   1 me  admin   2.1K Jan  4 14:22 index.html
        drwxrwxrwx   5 me  admin   170B Dec 31 20:45 nbproject
        drwxrwxrwx   5 me  admin   170B Jan  2 18:58 script

    And yet, I cannot figure out why the nginx process cannot read index.html. It's running as the "nobody" user, but the permissions are set such that anyone can read them.
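
    A file can be world-readable and still fail with open() ... (13: Permission denied) if the worker cannot traverse one of the parent directories; on a Mac, /Users/<name> is often 700 or 750, which locks out the nobody user that nginx (and Apache) run as. Checking and opening the path from the error message:

        ls -ld /Users/me /Users/me/Documents /Users/me/Documents/workspace /Users/me/Documents/workspace/mobile
        # give everyone search (execute) permission on each parent that lacks it:
        chmod o+x /Users/me /Users/me/Documents /Users/me/Documents/workspace

    The alternative is to keep the home directory closed and run the workers as your own user instead (for nginx, the user directive in nginx.conf), which would also explain why the stock Apache setup showed the same 403s.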

    Read the article

  • Compressing and copying large files on Windows Server?

    - by Aaron
    I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail. The database server is running Windows Server 2003 R2 Enterprise, 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. Files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to me to compress the files first. I've tried a number of methods, including: gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin) bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin) WinRAR 3.90 7-Zip 4.65 (7za.exe) I've attempted to use WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either- no matter the tool I've tried, it ends up butting against the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately. For example, this is how 7-Zip dies: H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak 7-Zip (A) 4.65 Copyright (c) 1999-2009 Igor Pavlov 2009-02-03 Scanning Creating archive h:\dbatmp\zip\db-20100419_1228.7z Compressing db-20100419_1228.bak System error: Unspecified error

    Read the article

  • Copying compressed files from Server 2008 R2 network share to XP client via VPN fails

    - by Dejan Janjuševic
    At the first sight the question looks similar to this one. I have experienced an odd behavior while trying to copy a certain file from Windows Server 2008 R2 network share to Windows XP Professional client via VPN. The VPN was set up using RRAS on the server machine. I will try to provide as much informations as possible in order to make the issue more clear. When trying to copy the compressed file sized ~2.5 MB (via Explorer or CMD, doesn't matter), the process stalls after some 20%, producing an error message after few seconds: Cannot copy filename: The specified network name is no longer available. If i start the command ping -t 192.168.2.1 (where the IP address specified belongs to the server) side by side with the copy command, I can clearly see that the ping command times out for few seconds as the copy process stalls. When this happens all network activities are frozen. After a few seconds, the network recovers, ping continues to run normally, however the copy process stands still before it displays the above error message. Copying other files (I tried 4-5 files), of which some are larger and some are smaller, succeeds. Seems to me that I can copy all uncompressed files. As soon as I try to copy an archive, the process freezes. Even a 707 KB large archive can't be copied. I can only reproduce this behavior on 2 machines, both Windows XP Professional, one is w/ SP2 and the other w/ SP3. Other XP clients don't have this problem, neither do Windows 7 clients. If I connect to the server using Remote Desktop Connection without using VPN from either of these 2 machines (using the same user account), I can copy anything I want normally, even these "problematic" files. Does anyone have any clue about what could possibly be going on?
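
    The pattern of only compressed files stalling can point at an MTU/fragmentation problem on the VPN: archives cannot be shrunk any further by link-level compression, so they generate full-size packets that a too-small path MTU silently drops. One way to probe it from the XP client (1400 is just a starting guess):

        ping -f -l 1400 192.168.2.1
        rem lower the value until "Packet needs to be fragmented but DF set" disappears;
        rem largest working payload + 28 bytes of headers = path MTU, then set the VPN adapter's MTU just below that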

    Read the article
