Search Results

Search found 62606 results on 2505 pages for 'sql files'.

  • Are Dynamic Prepared Statements Bad? (with php + mysqli)

    - by John
    I like the flexibility of dynamic SQL and I like the security and improved performance of prepared statements. So what I really want is dynamic prepared statements, which is troublesome to build because bind_param and bind_result accept a "fixed" number of arguments. So I made use of an eval() statement to get around this problem, but I get the feeling this is a bad idea. Here's example code of what I mean:

        // array of WHERE conditions
        $param = array('customer_id'=>1, 'qty'=>'2');

        $stmt = $mysqli->stmt_init();
        $types = '';
        $bindParam = array();
        $where = '';
        $count = 0;

        // build the dynamic sql and param bind conditions
        foreach ($param as $key=>$val) {
            $types .= 'i';
            $bindParam[] = '$p'.$count.'=$param["'.$key.'"]';
            $where .= "$key = ? AND ";
            $count++;
        }

        // prepare the query -- SELECT * FROM t1 WHERE customer_id = ? AND qty = ?
        $sql = "SELECT * FROM t1 WHERE ".substr($where, 0, strlen($where)-4);
        $stmt->prepare($sql);

        // assemble the bind_param command
        $command = '$stmt->bind_param($types, '.implode(', ', $bindParam).');';

        // evaluate the command -- $stmt->bind_param($types,$p0=$param["customer_id"],$p1=$param["qty"]);
        eval($command);

    Is that last eval() statement a bad idea? I tried to avoid code injection by encapsulating values behind the variable name $param. Does anyone have an opinion or other suggestions? Are there issues I need to be aware of?
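
    One way to avoid the eval() entirely, sketched below under the assumption that $stmt, $types and $param are built exactly as above: hand the arguments to call_user_func_array(), since mysqli_stmt::bind_param() takes a variable number of arguments but requires them by reference.

        // collect the type string plus one reference per value
        $args = array($types);
        foreach ($param as $key => $val) {
            $args[] = &$param[$key];   // bind_param needs references, not copies
        }
        call_user_func_array(array($stmt, 'bind_param'), $args);
        $stmt->execute();

    On PHP 5.6 or later, argument unpacking reads more naturally: $stmt->bind_param($types, ...array_values($param)), the unpacked elements being passed by reference.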

    Read the article

  • Accidentally deleted /opt/local/bin without backup. Any help?

    - by Aaron
    Hi all, I'm on Mac OS X 10.5.8. I was recently uninstalling mysql5 from /opt/local/bin, and I typed:

        rm -rf /opt/local/bin mysql*

    instead of:

        rm -rf /opt/local/bin/mysql*

    This deleted my entire /opt/local/bin directory, which puts me in a bit of a bind. Is there any way to recover these files? If not, I have a friend who is using a similar set of programs; would it be possible to use the contents of his folder? If I end up needing to re-install everything in this folder, what is the best way to go about doing this? Thanks in advance!

    Read the article

  • Windows 7 Sub-Folders hidden in "Program Files" directory

    - by ron tornambe
    I have been Google-searching for an hour now and I am confounded. I am using Inno Setup to install a .NET WinForms application that creates directories and folders on the fly. (I have set the folder options to display hidden files and folders.) Although the files that are added to "created" folders appear within the application, they do not show up in Windows Explorer, or even when issuing a dir from a command prompt. I have also modified the application to display (and delete) the contents of these (seemingly imaginary) folders, so I am sure they exist. What am I missing?

    Read the article

  • Google Drive desktop client not updating existing files from other users

    - by cqm
    I've looked around and there doesn't really seem to be any troubleshooting information for the Google Drive desktop client; it all assumes you are using Google Docs on the web. Anyway, my team is trying to use Google Drive like Dropbox, with multiple people editing shared files, such as images, from the desktop. Dropbox is really good at noticing when a checksum for a file has changed and syncing it. Google Drive's desktop client seems not to do this at all: it only syncs newly created files, and gives no notification at all that a modified version exists. It will never sync the modified file, even though going online and opening that file shows the modified version. Is there any way to fix this? The answer has nothing to do with proxy or firewall configurations, and the team is using computers running OS X and Windows.

    Read the article

  • Regex to allow all php files except one

    - by Tim
    Hi all, I have this regex that allows all PHP files:

        ^.*\.([Pp][Hh][Pp])

    How can I exclude a specific file, for example test.php? Thanks for your answer. Best regards

    [Edit] I omitted to say that it is a regex from an .htaccess file; the /i flag doesn't seem to work, and neither does ?.

    [Edit 2] The purpose is to grant access to authenticated users, except for one file that has to be allowed for everyone. So I've done:

        <Files ~ "^.*\.([Pp][Hh][Pp])$">
            AuthUserFile /directory/.htpasswd
            AuthGroupFile /dev/null
            AuthName "Please log in ..."
            AuthType Basic
            require valid-user
        </Files>

    So all PHP files require a valid user. I would like to add an exception for a specific file, say test.php.
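
    For the pattern itself, the standard PCRE device is a negative lookahead. A minimal sketch checking the pattern in PHP (whether the lookahead is honoured inside .htaccess depends on the Apache build; <FilesMatch> in Apache 2.x also uses PCRE):

        // the lookahead (?!test\.php$) rejects exactly test.php;
        // /i makes the whole match case-insensitive, replacing [Pp][Hh][Pp]
        $pattern = '/^(?!test\.php$).+\.php$/i';

        var_dump(preg_match($pattern, 'index.php')); // int(1): allowed
        var_dump(preg_match($pattern, 'Test.PHP'));  // int(0): excluded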

    Read the article

  • Vantec NexStar NAS Enclosure - Writing large files

    - by peter
    I have one of these 'Vantec NexStar LX - NST-475LX-BK' drive enclosures; it is a NAS device. When I write a file to the device over eSATA or an SMB share, I cannot write files over 4GB. I think this is because the drive is formatted with FAT32, which has a 4GB per-file limit. But when I access the device using FTP it doesn't matter: I can write files of any size. E.g. I wrote one on there last night which was 30GB. Does this make any sense? Why? I guess the most important thing for me is data integrity.

    Read the article

  • Change permission of files with the owner 'apache'

    - by Dotty
    Hey, I have some files on my server with the owner set to "apache"; I'm not quite sure how this happened. Anyway, I need to change the permissions of these files to 0777 so I can download/edit them. However, I cannot. I'm using a 1and1 Linux server and use Plesk to administrate it. I have the ability to log in via SSH. However, if I run chmod or chown I get a "permission denied" error, and if I try to sudo chmod or chown it says the command cannot be found. When I go to edit my domain details, I get the option "Shell access to server with FTP user's credentials" with these choices:

        /bin/sh
        /bin/bash
        /sbin/nologin
        /bin/bash (chrooted)
        /bin/rbash

    Any ideas how I should go about changing the permissions or changing the owner? Thanks

    Read the article

  • Vagrant is creating files and folders in my project

    - by SERPRO
    Recently I updated Vagrant (v1.6.3) and I noticed that in the folder of my project there are some new folders and files like:

        d20140610-11944-1j6n1cz/
        d20140610-15421-1pkz3t8/
        vagrant20140610-11944-p76ezc
        vagrant20140610-11944-p76ezc2
        vagrant20140610-11944-yt3bhz
        vagrant20140610-11944-yt3bhz1
        vagrant20140610-15421-mfqrig
        vagrant20140610-15421-mfqrig1
        vagrant20140610-15421-y3r71a
        vagrant20140610-15421-y3r71a2
        vagrant20140610-15421-y3r71a2.lock

    Most of the files are empty; others have text like this:

        source "https://rubygems.org"
        source "http://gems.hashicorp.com"

        gem "vagrant", "= 1.6.3"

        group :plugins do
          gem "vagrant-login", nil, {}
          gem "vagrant-share", nil, {}
        end

    The directories each contain a file named config with this info:

        BUNDLE_PATH: "/home/user/.vagrant.d/gems"

    Is this some kind of debug option? How can I disable it?

    Read the article

  • Missing drive space in Server 2003

    - by Tim Brigham
    I have two drives used for SQL backups which for the last week have been acting strange: the free space indicated by Windows is far off from what WinDirStat, etc. indicate. There should only be about 60 GB of drive space used, but about 160 GB is reported as used. This would match the utilization if the two last backup files were still residing on disk. SQL Server is 2000, the OS is Server 2003 x64, running on a VMware 5.0 cluster. OSSEC and McAfee show this system as clean. My current plan is to temporarily attach one of these drives to another VM for analysis. Is there anything more I should be looking at? There were a lot of pages on the net when I was looking for documentation on this issue, but I haven't found this case described. EDIT: Unfortunately even a full reboot did not clear this behavior. I also used Process Explorer to look for open file handles. No dice.

    Read the article

  • Ignoring generated files when using "Treat warnings as errors"

    - by krystan honour
    We have started a new project but also have this problem on an existing project. The problem is that when we compile with a warning level of 4, we also want to switch on 'Treat all warnings as errors'. We are unable to do this at the moment because generated files (in particular Reference.cs files) are missing things like XML comments, which generates a warning. We do not want to suppress the XML-comment warnings across all files, only in specific types of files (namely generated code). I have thought of a way this could be achieved but am not sure if it is the best approach, or indeed where to start :) My thinking is that we need to do something with T4 templates so that the generated code does fill in its XML documentation. Does anyone have any ideas? Currently I'm at well over 2k warnings (it's a big project) :(

    Read the article

  • $_POST goes empty after adding a new input type "file"

    - by heldrida
    Hi, I'm not finding a way to understand and fix this, and I've tried a lot. I've got a script which is a simple form that sends a file through POST; a second file processes the info. By default, I give the user a few fields, one of them being an input field of type "file", and there are also a few "hidden" ones that give me values to work with on POST. I found that when I add a second input of type "file", $_POST comes back as an empty array, and even $_FILES returns nothing. I have no idea how to fix this, and it works just fine when keeping the single default input box of type "file". This is the form: http://pastie.org/872488 When the problem occurs, var_dump($_POST), var_dump($_FILES), print_r(), etc. all return nothing. I've tried giving the file inputs an array-style name, like img_p_child[], but nothing. How do I solve this? Thanks for taking your time!
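
    One hedged guess worth ruling out, since the symptom (both $_POST and $_FILES coming back empty) matches it exactly: if the combined POST body grows past PHP's post_max_size once the extra file is attached, PHP silently discards the whole body. A minimal diagnostic sketch:

        // if the request had a body but PHP kept none of it, size limits are
        // the likely culprit (post_max_size / upload_max_filesize in php.ini)
        if ($_SERVER['REQUEST_METHOD'] === 'POST'
                && empty($_POST) && empty($_FILES)
                && !empty($_SERVER['CONTENT_LENGTH'])) {
            printf("POST of %d bytes discarded; post_max_size=%s, upload_max_filesize=%s",
                   $_SERVER['CONTENT_LENGTH'],
                   ini_get('post_max_size'),
                   ini_get('upload_max_filesize'));
        }

    Also double-check that the form still declares enctype="multipart/form-data"; without it, file inputs never reach $_FILES at all.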

    Read the article

  • How do I distinguish files and folders on an FTP server

    - by soulmerge
    I want to list all files on an FTP server using PHP. According to RFC 959 the FTP command LIST is allowed to print arbitrary human-readable information on files/folders, which seems to make it impossible to determine the file type correctly. But how do other FTP clients manage to distinguish files and folders? Is there an unwritten standard or such?
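
    A common workaround in PHP, sketched below (is_ftp_dir is my own helper name, not a library function): attempt to CWD into each entry; directories accept the change, plain files refuse it.

        // heuristic type test: ftp_chdir() succeeds only for directories
        function is_ftp_dir($conn, $entry) {
            if (@ftp_chdir($conn, $entry)) {
                ftp_cdup($conn);   // step back out to where we were
                return true;
            }
            return false;
        }

        // hypothetical host and credentials, for illustration only
        $conn = ftp_connect('ftp.example.com');
        ftp_login($conn, 'user', 'password');
        foreach (ftp_nlist($conn, '.') as $entry) {
            printf("%-4s %s\n", is_ftp_dir($conn, $entry) ? 'dir' : 'file', $entry);
        }

    Servers that implement the newer MLSD command (RFC 3659) report the entry type explicitly, which is exactly the machine-readable listing that RFC 959's LIST never promised.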

    Read the article

  • Virtual hosting in Varnish with individual vcl files for configuration

    - by Michael Sørensen
    I wish to use Varnish in front of an Apache and a Tomcat on the same server. Depending on the IP requested, the request goes to a different backend. This works. Now, for most of the sites the default Varnish logic will work just fine, but for some specific sites I wish to use custom VCL code. I can test for host name and include config files for the specific domains, but this only works inside the individual methods (vcl_recv etc.). Is there a way to include a complete set of instructions, in one file, per domain, without having to manage separate files for subdomain_recv, subdomain_fetch, etc.? And preferably without running separate instances of Varnish. When I try to include a file at the "root level" of default.vcl, I get a compilation error. Best regards, Michael

    Read the article

  • Redirected site files for desktop downloader

    - by Jeje
    Hi, I temporarily moved the static files on a site to a different location, but these files must stay accessible from their old URLs, so I've created a script that redirects to the right place. The problem is that the files are downloaded by a third-party program, and that program ignores the redirect. I tried using a permanent redirect, but with no success.

    Read the article

  • Which are the RDBMS that minimize the server roundtrips? Which RDBMS are better (in this area) than MS SQL?

    - by user193655
    When latency is high ("when pinging the server takes time"), server roundtrips make the difference. Now, I don't want to focus on the roundtrips created in programming, but on the roundtrips that occur "under the hood" in the DB engine, i.e. the roundtrips that are 100% dependent on how the RDBMS itself is written. I have been told that Firebird has more roundtrips than MySQL, but this is the only information I have. I currently support MS SQL but I'd like to change RDBMS (because I use the Express Editions, and in my scenario they are quite limiting from a performance point of view), so to make a wise choice I would like to include this point in "my RDBMS comparison feature matrix" to understand which is the best RDBMS to choose as an alternative to MS SQL. So the bold sentence above would make me prefer MySQL to Firebird (for the roundtrip concept, not in general), but can anyone add information? And where does MS SQL sit? Is anyone able to "rank" the roundtrip performance of the main RDBMSs, or at least MS SQL, MySQL, PostgreSQL, and Firebird? (I am not interested in Oracle since it is not free, and if I have to change I would change to a free RDBMS.) Anyway, MySQL (as mentioned several times on Stack Overflow) has an unclear future and a not-100%-free license, so my final choice will probably fall on PostgreSQL or Firebird. Additional info: you could answer my question with a simple list like MSSQL: 3; MySQL: 1; Firebird: 2; PostgreSQL: 2 (where 1 is good, 2 average, 3 bad). Of course, if you can post some links where the roundtrips of the various RDBMSs are compared, that would be great.

    Read the article

  • Files slow to save sometimes after Ubuntu upgrade

    - by Matchu
    I haven't quite been able to track down why this happens sometimes in Ubuntu 10.04 and not other times. I'll go into gedit or OpenOffice.org and try to save files, and, during some sessions, it will take up to 10 seconds to save the file, sometimes causing the program to become unresponsive. But during these same sessions, the files sometimes save instantly. This didn't start happening until after the 10.04 (Lucid) upgrade. I suspect that something is reading all the changes I make, or that there's some other big file action going on, or something like that. I disabled Tracker a while back, before the upgrade, and don't see it under the settings - could it be back under a different name under Lucid? You probably don't already know the answer, but how can I go about finding the cause of this problem?

    Read the article

  • TransactionScope and Transactions

    - by Mike
    In my C# code I am using TransactionScope because I was told not to rely on my SQL programmers always using transactions, and we are responsible, and yada yada. Having said that, it looks like the TransactionScope object rolls back before the SqlTransaction? Is that possible, and if so, what is the correct methodology for wrapping a TransactionScope around a SQL transaction? Here is the SQL test:

        CREATE PROC ThrowError
        AS
        BEGIN TRANSACTION --SqlTransaction
        SELECT 1/0
        IF @@ERROR <> 0
        BEGIN
            ROLLBACK TRANSACTION --SqlTransaction
            RETURN -1
        END
        ELSE
        BEGIN
            COMMIT TRANSACTION --SqlTransaction
            RETURN 0
        END
        GO

        DECLARE @RESULT INT
        EXEC @RESULT = ThrowError
        SELECT @RESULT

    If I run this I get just the divide-by-zero error and the -1 return value. Called from the C# code, I get an extra error message:

        Divide by zero error encountered.
        Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK
        TRANSACTION statement is missing. Previous count = 1, current count = 0.

    If I give the SQL transaction a name, then:

        Cannot roll back SqlTransaction. No transaction or savepoint of that
        name was found.
        Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK
        TRANSACTION statement is missing. Previous count = 1, current count = 2.

    Sometimes the count seems to keep going up until the app completely exits. The C# is just:

        using (TransactionScope scope = new TransactionScope())
        {
            // ... execute SQL ...
            scope.Complete();   // TransactionScope commits via Complete(), not Commit()
        }

    Read the article

  • Storage setup for large files

    - by Mecca
    I need to store over 200TB of data (all types, the biggest being video files) and be able to access it over a local network. The files will be accessed for editing or searches. I don't need versioning, but a setup that would keep me safe from hard-drive failures would be nice. Right now the content is spread across different hard drives, some external, some regular; I don't exclude the possibility of buying new/extra drives if necessary. If the files are ever exposed to the web, it won't be to the public, just to a couple of people. I have no idea what to buy to make this happen. I see some NAS solutions on the internet, like this one: http://www.bestbuy.com/site/a/2266043.p?id=1218317764591&skuId=2266043 but the storage is not enough, plus it doesn't seem to be scalable. What do you recommend? Thanks

    Read the article

  • Shell script to copy files

    - by Hulk
    If there exists a directory /backup/ containing the files

        a.gz
        b.gz
        c.gz

    and another directory /backup-directorybackup containing the files

        a.gz

    I need a shell script to compare the two directories: if a file is already present in the destination, ignore it; if it is not present, copy it to the destination directory. Thanks.

    Read the article

  • Perfectly reproducible select statement default ordering issue...

    - by Dave
    Hi, I've recently been chasing an issue with a client's DB... solution found, but impossible to recreate. Essentially, we're doing:

        SELECT * FROM mytable WHERE ArbitraryColumn = 75

    where mytable has an identity column, 'MyIdentityColumn', incremented by one on each insert. Naturally I would assume that the rows would be returned in the order in which they were inserted (a bad assumption, but one which was forced onto me through an inherited application, which has been patched). Essentially, I would like suggestions as to why the same query behaves differently when the client's database is restored to my local machine than it does as a test DB on the client site. On my machine, the above select returns the rows in insert order (i.e. identity column ascending). On the client, the order seems random (but the same 'random' order each time)... A few other points:

        - I have the same collation on my test server as the client
        - Same DB backup restored to a test DB only I can access
        - Same SQL Server version and service pack (2000 SP3)
        - Same OS
        - The test DB is a new DB: new log and MDF files...

    I have the problem 'solved' by adding an explicit ORDER BY clause, but I want to understand the cause of the issue, given that my attempts to recreate it have been futile while it is perfectly recreatable on the client server... Thanks in advance, Dave

    Read the article

  • How to manage libraries/JAR files in Eclipse?

    - by pvsnp
    I might be missing something, but how do you manage Java projects in Eclipse that need a lot of JAR files? I know Maven manages libraries well when there are new updates, but maybe I'm missing something: is there a way for Eclipse to pick up updated JAR files (it would be especially useful for projects using apache-commons, say)? I don't want this to sound like a feature request; I'm asking whether there are ways to keep the library JAR files that a Java project uses updated automatically, the way Maven does. With more languages shipping with this type of feature, finding the right JAR files probably should be easier than this.

    Read the article

  • Xcode 4.3: XIB files and localisation

    - by Fabrice MAUPIN
    I have a problem with XIB files and localizations (Xcode 4.3, Mac OS X 10.7.4). My application supports French and English localizations. For my test, I decided to change the language and region from "French" to "English" in System Preferences. When I launch my application, it always displays the old (French) XIB files and not the XIB files "attached" to my new localization! I followed all the recommendations I was able to find: I cleared all caches, cleaned the project, ... and so on! The problem is still here and persists. Could it be that Xcode 4 has other files to delete somewhere else? Is it possible to use another means to test my new localization? If you have an idea, ... Thanks in advance. FM.

    Read the article

  • Grep all files in a directory and print matches with file name

    - by javanix
    I have a list of log files that I create as part of a video encoding script that I wrote. I would like to search all of them and print out certain statistics from the encode: how fast they were encoded, what settings were used, etc. I can search for the average framerate in one file via this one-liner:

        cat ${filename} | grep average

    which outputs:

        work: average encoding speed for job is 23.211176 fps

    and search for the ratefactor:

        cat ${filename} | grep RF

    I would like to search all files in the directory and print one, or preferably both, pieces of information along with the filename. Is there any way I can use find or grep to get this in a one-liner, or do I need to write a script? I would like output like this:

        /home/javanix/filename.log
        <RF line>
        <average line>

    I would like this to work on either FreeBSD 9 or Ubuntu 12.04.

    Read the article

  • Dreamweaver Files uploaded to Win 2008 server cause login prompt

    - by Lil
    I have a customer who uses a four-year-old version of Dreamweaver to edit her web pages. My hosting reseller account is with a company that uses Windows Server 2008. Every time my customer edits a page and uploads it, I have to manually set the permissions for that file to be readable, from the site's control panel. The customer is furious with me because her files cause the login prompt. I am able to upload files myself that remain readable on the site, with both FileZilla and FrontPage. I am assuming that her Dreamweaver settings are the cause of the problem, but I don't have that program myself and don't know what to advise her. Any suggestions?

    Read the article
