Search Results

Search found 47712 results on 1909 pages for 'looking for a script'.

  • Crafting an effective php/web programmer job post template [closed]

    - by Tchalvak
    I am looking to create a job post to get a satisfactory assistant programmer / templater. Specifically, a PHP & web programmer. I am, however, afraid of forgetting important things. So, are there resources you can suggest for templates of things to ask and things to tell in a job post for a programmer? Surprisingly, I wasn't able to find similar questions on this site, so there may be duplicate questions out there that I could use but just didn't find. Right now I know that my -requirements- are so generic that they're going to get me in trouble with a flood of applications, e.g. the candidate must know PHP and must be able to separate PHP from HTML. So I'm looking for criteria that are must-haves, must-mentions, or a general template to try to avoid a "lemon". I also started a gist to work on a job post; comments/edits would be excellent: https://gist.github.com/2906808

    Read the article

  • What was the first programming language written for computers?

    - by ThePlan
    Looking at the many programming languages we have today, each one unique in its own way, I've tried to figure out what the first programming language written for computers was. Looking at the release dates of the popular ones got me somewhat close, but I didn't look at the less obvious ones: programming languages that are either dead or see very little use nowadays. Fortran is the closest thing I found, but I don't know if it was really the first. In a nutshell: what was the first programming language written for computers? Are there any languages derived from that language?

    Read the article

  • CMD file time not always matching windows explorer file time

    - by skyrail
    I have a set of files whose created, modified and last-access dates I need to set to the EXIF date-taken value after a copy between two folders (which might be FAT32 on a memory card or NTFS on a fixed or USB disk). When I copy a file, its dates switch to the current date. I then change all three dates manually, either with the change-attributes dialog in Windows Explorer or with Far Manager on the command line. To make this faster I wrote a batch script: it reads the original file dates (with PHP and its stat function) and builds a batch file that invokes nircmd setfiletime for each file, which I then run against the copied versions. The operation is relatively fast and reliable. Unfortunately, a bunch of the files end up with last-access and created times that differ by one hour between cmd and Windows Explorer. Strangely, this only happens with dates between November and February, which makes the operation unreliable. Why is this happening, and how can I fix it?

    Read the article

  • Wireless switch on Dell XT2 - strange behaviour of rfkill

    - by DyP
    I have a Dell Latitude XT2 with an Intel WLAN card (lspci lists it as "Intel Corporation Ultimate N WiFi Link 5300") running Lubuntu 12.04 with recent updates. The laptop has a hardware WLAN switch, and I have problems activating the WLAN when booting with the hardware switch set to "off". The situation is a bit confusing, unfortunately: rfkill lists two WLAN devices (though lspci only shows the Intel one). This is the situation when booting with the hardware switch set to "Off": 0: dell-wifi: Wireless LAN Soft blocked: yes Hard blocked: yes 1: dell-bluetooth: Bluetooth Soft blocked: yes Hard blocked: yes 2: phy0: Wireless LAN Soft blocked: yes Hard blocked: yes From some tests, I conclude WLAN is only activated when both dell-wifi and phy0 are unblocked in software and hardware, but I can only unblock dell-wifi after the hardware switch is set to "on". Procedure right from boot with the hardware switch set to "Off": Soft-unblocking phy0 works as expected and could be done by a start-up script. sudo rfkill unblock 0: nothing happens; the soft block on dell-wifi is not removed. Set the hardware switch to "on": phy0 gets its hard block removed, but there is still no WLAN. sudo rfkill unblock 0: both the soft and the hard block on dell-wifi are removed, and WLAN is now active and works. sudo rfkill block 0: only adds the soft block, as expected; WLAN goes off again. So, in order to activate WLAN, I have to use the hardware switch and afterwards (manually) run a script - that's a bit inconvenient. Does someone know a better solution? Maybe a daemon could help that listens for rfkill events and unblocks dell-wifi after I have set the hardware switch to "on"? (That sounds like another workaround.) When booting with the hardware switch set to "On", nothing is blocked, neither hard nor soft.
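
    A minimal sketch of the event-listening workaround mentioned above, assuming the rfkill utility's event mode is available; the indices and the dell-wifi/phy0 names are taken from the question's rfkill list output and may differ on other machines or boots (run as root):

      #!/bin/sh
      # Watch rfkill events and, whenever the hard block on phy0 (index 2)
      # has been lifted by the hardware switch, soft-unblock dell-wifi (index 0).
      rfkill event | while read -r line; do
          if rfkill list 2 | grep -q "Hard blocked: no"; then
              rfkill unblock 0
          fi
      done

    Started from /etc/rc.local or an upstart job this removes the manual step, though a udev rule on the rfkill subsystem would be the cleaner long-term solution.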

    Read the article

  • Is it possible to extend a 504 timeout in nginx on a per location basis

    - by codecowboy
    Is it possible to set timeout directives within a location block to prevent nginx returning a 504 from a long-running PHP script (PHP-FPM)? location /myurlsegment/ { client_body_timeout 1000000; send_timeout 1000000; fastcgi_read_timeout 1000000; } This has no effect when making a request to example.com/myurlsegment: the timeout still occurs after approximately 60 seconds. PHP is configured to allow the script to run until completion (set_time_limit(0)), and I don't want to set a global timeout for all scripts.

    Read the article

  • How to bring application to front every 15 minutes on Mac OS X Lion?

    - by johnnyb10
    I have a free or cheap little app called Desktop Task Timer LE that I've been using to track my time as I work on various projects. I'd like to have it pop up as the foreground app every 15 minutes to prevent me from forgetting to stop/change the timer after I've moved on to a different task. I know I can have the app launch using a script in Automator or AppleScript, but I don't know how to have that script fire off every 15 minutes. I've read about using Launchd and iCal, but I'm still not sure how to do it. (Actually, iCal is probably simple, but I'd like to avoid using it for this.) Any ideas? Also, a further feature would be to have it pop up after 5 (or x) minutes of inactivity on the computer. Not sure if this would work for my needs, but I'd like to test it if possible.
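
    A minimal sketch of the "every 15 minutes" part that skips launchd entirely: a shell loop that activates the app via osascript. The application name is taken from the question and is an assumption about how the app registers itself with AppleScript:

      #!/bin/bash
      # Bring the timer app to the foreground every 15 minutes.
      while true; do
          sleep 900   # 15 minutes
          osascript -e 'tell application "Desktop Task Timer LE" to activate'
      done

    A launchd job with a StartInterval of 900 seconds running the same osascript line would be the more idiomatic OS X approach and survives reboots; the idle-detection part would need something extra (e.g. polling HIDIdleTime via ioreg) and is beyond a one-liner.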

    Read the article

  • Migrating users and mailboxes from postfix / Maildir to Postfix with Mysql backend [closed]

    - by Chrispy
    Possible Duplicate: Migrating users and mailboxes from postfix / Maildir to Postfix with Mysql backend So I've got 60 or so users on a hand-rolled Postfix installation on OpenBSD and I'd like to move their mailboxes to our new mail server running iRedMail (Postfix with a vmail/MySQL back end). Does anyone know of a good way to do this? Preferably a script I can run repeatedly to keep the users' mailboxes in sync until the MX records get updated. I presume one way (though I don't have all their passwords!) would be a command-line IMAP client that simulated the users copying their mail themselves, but surely there must be a shell or PHP script to migrate users? Anyone got any bright ideas? Chris.
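
    A minimal sketch of the usual approach, driving imapsync from a per-user credentials file; the hostnames and file format are placeholders, and it assumes you can obtain or reset each user's password (imapsync also has administrator/master-user authentication modes for the "I don't have their passwords" case - check its documentation for the exact options):

      #!/bin/bash
      # users.txt: one "user;password" pair per line (a made-up format for this
      # sketch). imapsync only copies messages missing on the destination, so
      # the loop can be re-run until the MX records are switched over.
      while IFS=';' read -r user pass; do
          imapsync \
              --host1 old.example.com --user1 "$user" --password1 "$pass" \
              --host2 new.example.com --user2 "$user" --password2 "$pass"
      done < users.txt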

    Read the article

  • Secure copy uucp style

    - by Alexander Janssen
    I often have the case that I have to make a lot of hops to reach a remote host, just because there is no direct routing between my client and that host. When I need to copy files from a remote host two or more hops away, I always have to: client$ ssh host1 host1$ ssh host2 host2$ scp host3:/myfile . host2$ exit host1$ scp host2:myfile . host1$ exit client$ scp host1:myfile . Back when uucp was still being used, this would have been as simple as uucp host1!host2!host3 /myfile . I know that there's uucp over ssh, but unfortunately I don't have the proper privileges on those machines to set it up, and I'm not sure I really want to fiddle around with customers' machines. Does anyone know of a method for doing this task without the need to set up a lot of tunnels or deploy new software to the remote hosts? Maybe some kind of recursive script which clones itself to all the remote hosts and does the hard work for me? Assume that authentication takes place with public keys and that all hosts do SSH agent forwarding. Edit: I'm not looking for a way to automatically forward my interactive session to the next-hop host. I want a solution to copy files bangpath-style using scp via multiple hops, without the need to install uucp on any of those machines. I don't have the (legal) rights or the privileges to make permanent changes to the ssh config, and I'm sharing this username and these hosts with a lot of other people. I'm willing to hack up my own script, but I wanted to know if anyone knows of something that already does it. Minimally invasive changes to the hosts on the bangpath, simple invocation from the client. Edit 2: To give you an impression of how it's properly been done for interactive sessions, have a look at the GXPC clustershell. This is basically a Python script that spawns itself onto all remote hosts that have connectivity and where your SSH key is installed. The great thing about it is that you can tell it "I can reach HostC via HostB via HostA", and it just works. I want to have this for scp.
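
    A minimal sketch of a client-only answer, assuming a reasonably recent OpenSSH client (7.3 or newer understands the ProxyJump option) and that host1/host2/host3 are reachable in that order with keys or agent forwarding as described; nothing needs to be installed on the intermediate hosts:

      # Single command run on the client; the hops are chained automatically.
      scp -o 'ProxyJump host1,host2' host3:/myfile .

    Older clients can get the same effect with nested ProxyCommand / ssh -W entries in ~/.ssh/config, which is still a change on the client only.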

    Read the article

  • Linux installation CD environment

    - by haw3d
    I recently made a custom Linux system for my own needs. It is on my HDD, but I want to create a CD to install it. A few days ago I found a live CD and created an install script for it, but a power failure killed my HDD and I can't find that live CD again. My install script is based on restoring a tar.gz backup. My requirements are: based on glibc (not uclibc), and it should recognize every device. Do you have any suggestions? Excuse me for my bad English.
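
    A minimal sketch of what a tar.gz-based restore script run from a live CD typically looks like; every device name, filesystem type and path here is a placeholder (not taken from the question) and the partitioning/bootloader steps would have to match the original system:

      #!/bin/sh
      set -e
      TARGET=/dev/sda1              # target root partition (adjust!)
      BACKUP=/cdrom/backup.tar.gz   # location of the backup archive

      mkfs.ext4 "$TARGET"                 # destroys existing data on TARGET
      mount "$TARGET" /mnt
      tar -xzpf "$BACKUP" -C /mnt         # -p preserves permissions/ownership
      mount --bind /dev  /mnt/dev
      mount --bind /proc /mnt/proc
      mount --bind /sys  /mnt/sys
      chroot /mnt grub-install /dev/sda   # reinstall the bootloader
      chroot /mnt update-grub || true     # regenerate grub.cfg if available
      umount /mnt/dev /mnt/proc /mnt/sys /mnt

    Any glibc-based live image with good hardware support (e.g. a stock Ubuntu or SystemRescue CD) can host a script like this.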

    Read the article

  • Focusing and Selecting the Text in ASP.NET TextBox Controls

    When a browser displays the HTML sent from a web server it parses the received markup into a Document Object Model, or DOM, which models the markup as a hierarchical structure. Each element in the markup - the <form> element, <div> elements, <p> elements, <input> elements, and so on - is represented as a node in the DOM and can be accessed programmatically from client-side script. What's more, the nodes that make up the DOM have functions that can be called to perform certain behaviors; which functions are available depends on what type of element the node represents. One function common to almost all node types is focus, which gives keyboard focus to the corresponding element. The focus function is commonly used in data entry forms, search pages, and login screens to put the user's keyboard cursor in a particular textbox when the web page loads, so that the user can start typing in his search query or username without having to first click the textbox with his mouse. Another useful function is select, which is available for <input> and <textarea> elements and selects the contents of the textbox. This article shows how to call an HTML element's focus and select functions. We'll look at calling these functions directly from client-side script as well as how to call them from server-side code. Read on to learn more!

    Read the article

  • How do I get PHP to work with UserDir

    - by Callmeed
    I've got a fresh CentOS 5.5 box and have installed Webmin+VirtualMin 3.79. I've enabled UserDir in Apache and the sites are visible via http://ipaddress/~user/, but PHP does not work. (PHP works fine if I visit a site via its domain.) Here's what I put in my httpd.conf to get where I'm at: <IfModule mod_userdir.c> UserDir public_html </IfModule> <Directory /home/*/public_html> Options -Indexes +IncludesNOEXEC +FollowSymLinks +ExecCGI allow from all AllowOverride All AddHandler fcgid-script .php AddHandler fcgid-script .php5 </Directory> When I try to hit a PHP file, I get a 500 error and the following is logged to /var/log/httpd/error_log: suexec failure: could not open log file fopen: Permission denied Any help/direction is appreciated.
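
    The "could not open log file" line suggests suexec bails out before it ever runs the script. A few diagnostic commands, sketched under the assumption of a stock CentOS 5 Apache layout ("someuser" is a placeholder):

      # Show suexec's compile-time settings: log file, allowed docroot
      # (AP_DOC_ROOT), minimum UID/GID, etc.
      /usr/sbin/suexec -V

      # Check that the suexec log named above is actually writable
      # (commonly /var/log/httpd/suexec.log on CentOS):
      ls -l /var/log/httpd/

      # suexec also insists that the script and its directory belong to the
      # user and are not group- or world-writable:
      ls -ld /home/someuser/public_html /home/someuser/public_html/index.php

    If suexec -V reports AP_DOC_ROOT=/var/www, the stock suexec will refuse to run anything under /home at all, which is a separate hurdle for UserDir setups and usually means rebuilding suexec with a different docroot or not using suexec for these directories.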

    Read the article

  • Upgrade MySQL to 5.5 on Lucid, upgrade server to Precise or switch to Percona?

    - by xref
    Looking into upgrading MySQL on our development server, which is running 10.04 and so is stuck at MySQL 5.1; it appears there is no apt-get support for upgrading to 5.5 except via certain third-party PPAs. So I'm looking for which route to take and what other people have done: a) Follow a couple-of-years-old guide to manually install MySQL 5.5, and then invest ongoing time into manually downloading and installing security updates every month or two? b) Upgrade 10.04 to 12.04 and, judging from the experience of other people I work with, spend several days working out the kinks of that large upgrade, after which I'll have access to MySQL 5.5 and easy apt-get installation of future security updates? c) Switch from MySQL to Percona Server 5.5 and get all the benefits of that version of MySQL, plus easy apt-get updates with their PPA? d) Something else?
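
    Whichever route is chosen, the first step is the same; a minimal sketch of the backup-then-upgrade path for option (b), with the dump location as a placeholder:

      # Dump everything first, so the data can be restored on MySQL 5.5 or
      # Percona Server regardless of which route is taken.
      mysqldump --all-databases --events --routines -u root -p > /root/all-databases.sql

      # Option (b): move to 12.04 and get MySQL 5.5 from the stock repositories.
      sudo do-release-upgrade

    Percona's apt repository (option c) works in much the same way once their repository and signing key are added; their site documents the exact sources.list lines.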

    Read the article

  • Analytics on Mobile Phones

    - by Samuh
    Tracking events and setting up analytics for websites seems easy. You create an account with one of the analytics service providers like Google, they give you JavaScript code that you embed in your pages (for whichever events you wish to track) and voilà, you're done. I have written a native application for Android phones, which is actually an adaptation of the actual web site. Now I am required to set up analytics and tracking for this native application. Question: how do I do this on mobile phones from within a native application? We have JavaScript code that works for the original web site. Is there a way to incorporate that in the native application? I know Android supports JavaScript via WebViews (WebKit); my application does not use WebViews and is native. Also, I have not worked on JavaScript since school, so excuse me if I sound naive. Thanks.

    Read the article

  • Binding services to localhost and using SSH tunnels - can requests be forged?

    - by Martin
    Given a typical webserver with Apache2, common PHP scripts and a DNS server, would it be sufficient from a security perspective to bind administration interfaces like phpMyAdmin to localhost and access them via SSH tunnels? Or could somebody who knew, e.g., that phpMyAdmin (or any other commonly available script) is listening on a certain port on localhost easily forge requests that would be executed if no other authentication were present? In other words: could somebody from somewhere on the internet easily forge a request so that the webserver would accept it, thinking it originated from 127.0.0.1, if the server is listening on 127.0.0.1 only? If there were a risk, could it be somehow dealt with at a lower level than the application, e.g. by using iptables? The idea being that if someone found a weakness in a PHP script or Apache, the network would still block the request because it did not arrive via an SSH tunnel.
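
    For the lower-level, belt-and-braces part: the Linux kernel normally treats packets claiming a 127.0.0.0/8 source on a non-loopback interface as martians and drops them, but it costs little to make that explicit in the firewall. A minimal iptables sketch (interface names aside, nothing here is specific to the question's setup):

      # Drop anything arriving on a real interface that pretends to be loopback
      # traffic; legitimate 127.0.0.0/8 packets only ever arrive on lo.
      iptables -A INPUT ! -i lo -s 127.0.0.0/8 -j DROP
      iptables -A INPUT ! -i lo -d 127.0.0.0/8 -j DROP

      # Keep reverse-path filtering and martian logging switched on as well.
      sysctl -w net.ipv4.conf.all.rp_filter=1
      sysctl -w net.ipv4.conf.all.log_martians=1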

    Read the article

  • Javascript widgets: do links count as SEO backlinks? [closed]

    - by j0nes
    Possible Duplicate: How good is it for SEO if you have a widget that lives on other sites? On my website I offer an option to let users embed information from my site with a kind of "homepage widget". If a user wants to embed it in his website, he basically has to add one line of JavaScript to his HTML files like this: <script src="http://mysite.com/myscript.php?some_options_here"></script> Inside the widget, I export some content from my website and of course create a link back to my website. This is done in JavaScript with document.write: document.writeln("My great exported content"); document.writeln('<a href="http://mysite.com?ref=widget">Check mysite.com</a>'); I have Google Analytics set up to track whether the links in there get clicked, and they do. Now I am asking myself whether Google recognizes these links as valid backlinks from the embedding domain. I know that Googlebot can parse and execute JavaScript, but I have not found any references on whether these links also count as "normal" backlinks.

    Read the article

  • "service"-command and environment variables

    - by varesa
    I am trying to start a service that requires an environment variable to be set to a certain path. I set this variable in /etc/profile.d/, but when I start the service using the service command, it doesn't work. From man service: service runs a System V init script in as predictable an environment as possible, removing most environment variables and with the current working directory set to /. So it seems that service is removing my variables. How should I set the variables up to keep them from being removed? Or is that something I should not do? I could start the service manually using the init scripts, or even hardcode the path into the script, but I'd like to know how to use it with the service command.
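
    A minimal sketch of the usual Debian/Ubuntu convention: keep such settings in a defaults file next to the init script rather than in /etc/profile.d (which is only sourced by login shells, not by init). The service and variable names below are placeholders, not taken from the question:

      # /etc/default/myservice  -- plain shell fragment
      MYAPP_HOME=/opt/myapp
      export MYAPP_HOME

      # Near the top of /etc/init.d/myservice, source that file so the variable
      # survives the scrubbed environment the service command sets up:
      [ -r /etc/default/myservice ] && . /etc/default/myservice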

    Read the article

  • How do I configure ubuntu server's iptables to allow java without opening the floodgates?

    - by rofls
    I'm new to servers, so please bear with me. I have my amateur site running. Problem is, I followed Rackspace's instructions on setting up iptables, and I'm pretty sure that's why the Java server I'm trying to use on port 8080 isn't working (it runs the script, but my Android test app doesn't connect to it). When I try running the same Java server script on port 80, it doesn't even start. I also ran nmap on my domain and saw that indeed only port 80 and port 22 (for SSH) are responding. Is it possible to run Java and Apache happily on the same server? If so, how can I configure my iptables correctly? (I'm aware that I should probably do some sort of filtering in the Java server itself, but I will figure that out later.)
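
    Running Apache on 80 and a Java service on 8080 side by side is fine; the firewall just needs one more rule. A minimal sketch, assuming the same pattern such guides use for port 80 (insert the rule before any final REJECT/DROP rule, then save):

      # Allow new TCP connections to the Java server on 8080.
      iptables -I INPUT -p tcp --dport 8080 -m state --state NEW -j ACCEPT

      # Persist the change, e.g. if your setup restores rules from a file at boot:
      iptables-save > /etc/iptables.rules
      # (or install the iptables-persistent package and let it save/restore them)

    The separate failure on port 80 is expected: binding ports below 1024 needs root (or an authbind/capabilities workaround), and something is presumably already listening there anyway.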

    Read the article

  • Errors when installing updates

    - by user71613
    I am getting the following errors when installing updates. They started to appear after I upgraded my system to 12.04. Errors were encountered while processing: samba-common samba-common-bin samba grub-pc grub-gfxpayload-lists Setting up samba-common (2:3.6.3-2ubuntu2.2) ... perl: error while loading shared libraries: libperl.so.5.12: cannot open shared object file: No such file or directory dpkg: error processing samba-common (--configure): subprocess installed post-installation script returned error exit status 127 dpkg: dependency problems prevent configuration of samba-common-bin: samba-common-bin depends on samba-common (>= 2:3.4.0~pre1-2); however: Package samba-common is not configured yet. dpkg: error processing samba-common-bin (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of samba: samba depends on samba-common (= 2:3.6.3-2ubuntu2.2); however: Package samba-common is not configured yet. samba depends on samba-common-bin; however: Package samba-common-bin is not configured yet. dpkg: error processing samba (--configure): dependency problems - leaving unconfigured Setting up grub-gfxpayload-lists (0.6) ... Setting up grub-pc (1.99-21ubuntu3.1) ... perl: error while loading shared libraries: libperl.so.5.12: cannot open shared object file: No such file or directory dpkg: error processing grub-pc (--configure): subprocess installed post-installation script returned error exit status 127 Any ideas how to fix this?
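
    The root cause in output like this is usually a half-upgraded Perl (something still expects the 10.04-era libperl.so.5.12 after the move to 12.04); samba and grub only fail because their maintainer scripts call perl. A hedged sketch of the commonly suggested recovery steps, not a guaranteed fix:

      sudo apt-get update
      sudo apt-get -f install        # let apt try to repair broken dependencies
      sudo dpkg --configure -a       # finish any half-configured packages
      sudo apt-get install --reinstall perl perl-base perl-modules
      # then retry the packages that failed:
      sudo apt-get install --reinstall samba-common samba-common-bin samba grub-pc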

    Read the article

  • Eliminating zero-length files

    - by RhZ
    I have been having multiple crashes recently - 4 or 5 last night within a few hours. I posted about it before and got an answer, but I'm not sure how to proceed. The messages in my logs right before a crash are multiple complaints that valid eCryptfs headers were not found. The cron entry might not be related; I don't think I saw it in previous crashes: xxx-desktop kernel: [ 1112.274474] Valid eCryptfs headers not found in file header region or xattr region, inode 32376924 xxx-desktop CRON[4212]: (root) CMD ( cd / && run-parts --report /etc/cron.hourly) So I was sent to an answer providing this script: for i in $(find $(mount | grep " on $HOME type ecryptfs" | awk '{print $1}') -size 0c); do if ! fuser -v "$i"; then rm -f "$i"; fi; done I did find some zero-byte files, though not in exactly the right place (a folder called .Private, as I remember), but I need to fix this; it's too bad right now. So I need to delete any of them that are not in use. I am a little too clueless - can someone walk me through executing this script? I don't know how.
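
    A step-by-step sketch of running it that shows what would be deleted before anything is removed; the only assumption is that the encrypted lower directory is the one found by the mount | grep line above:

      #!/bin/bash
      # Save as ~/clean-zero-length.sh, then run:
      #   chmod +x ~/clean-zero-length.sh && ~/clean-zero-length.sh
      lower=$(mount | grep " on $HOME type ecryptfs" | awk '{print $1}')
      [ -n "$lower" ] || { echo "no ecryptfs mount found under $HOME"; exit 1; }

      # 1. Dry run: list the zero-length files without touching them.
      find "$lower" -size 0c -print

      # 2. If the list looks right, delete the ones no process is using.
      find "$lower" -size 0c | while read -r f; do
          if ! fuser -v "$f" >/dev/null 2>&1; then
              rm -f "$f"
          fi
      done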

    Read the article

  • How do you properly word a Google search when you don't even have a solution in mind? [closed]

    - by Bruno Romaszkiewicz
    So, I'm stuck on a problem and looking for a solution; my rubber duck can't help me, and my co-workers can't help me. The next natural step is research, right? Google can help me - he always can, or so I'm told. My problem is that I've never found much use for Google when looking for a programming solution: it's very useful for finding out how to implement one, but when you don't even know where to start, how do you properly word a Google search? Is there any other option?

    Read the article

  • Finding Locked Out Users

    - by Bart Silverstrim
    Active Directory, up to 2008 (our servers are a mix of 2008 and 2003). I'm looking for a quick way to query AD to find out which users are locked out, preferably from a batch or script file, to monitor for possible issues with user accounts being attacked by an automated attack or just anomalies in the network. I've Googled and my Google-fu has failed; I found a Microsoft knowledgebase article that cites a query string to use on Server 2003 with the management snap-in's saved queries (http://support.microsoft.com/kb/555131), but when I entered it, the query returned 400 users that a spot-check showed did NOT have a checkmark in the "Account is locked out" box under "Account". In fact, I don't see anything wrong with their accounts. Is there a simple utility (WiseSoft BulkADUsers apparently uses this method behind the scenes, since its results were also wrong) that will give a count of locked-out users and possibly their user object names? A script? Something?

    Read the article

  • What are the best Small Business Servers/Storage

    - by nasty
    I am a web designer/developer and work mostly with large files (50 MB+). I currently have a MyBook Live which is connected wirelessly to my MacBook Pro and to a Dell desktop PC running Windows. Since it's connected wirelessly (it doesn't have an Ethernet port), files load slowly and it's hard to work straight off the server. Now I'm looking for better storage/server solutions and am looking at dell.com.au/servers, but I'm not sure which one to choose. Can you give me some suggestions on whether I should buy the AUD 599 model or upgrade to something better? Is anyone here having the same issue as me?

    Read the article
