Search Results

Search found 67192 results on 2688 pages for 'excel external data'.


  • Invoking external shell application through RubyCocoa

    - by Shyam
    Hi, how would I invoke an external application through the RubyCocoa bridge? I read something about NSTask, yet I have no idea how it translates into RubyCocoa. My goal is to have an IB_action trigger a method that runs a terminal command with some parameters, e.g. ls -p $mydir, where mydir comes from an IB_outlet such as a text field. Any directions or help would be greatly appreciated!
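    A minimal RubyCocoa sketch of the NSTask idea (the launch path, arguments and the NSData-to-string conversion shown here are assumptions, not taken from the question):

        require 'osx/cocoa'

        # Hypothetical helper: run "ls -p <dir>" via NSTask and return its output.
        def list_directory(mydir)
          task = OSX::NSTask.alloc.init
          task.setLaunchPath("/bin/ls")
          task.setArguments(["-p", mydir])

          pipe = OSX::NSPipe.pipe
          task.setStandardOutput(pipe)

          task.launch
          task.waitUntilExit

          # readDataToEndOfFile returns NSData; decode it as UTF-8 text.
          data = pipe.fileHandleForReading.readDataToEndOfFile
          OSX::NSString.alloc.initWithData_encoding(data, OSX::NSUTF8StringEncoding).to_s
        end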

    Read the article

  • Serve external template in Django

    - by AlexeyMK
    Hey, I want to do something like return render_to_response("http://docs.google.com/View?id=bla", args) and serve an external page with Django arguments. Django doesn't like this (it looks for templates in very particular places). What's the easiest way to make this work? Right now I'm thinking of using urllib to save the page somewhere locally on my server and then serving it with the template path pointing there. Note: I'm not looking for anything particularly scalable here; I realize my proposal above is a little dirty.
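    A rough sketch of the urllib idea that skips the save-to-disk step: fetch the remote page and render it as an in-memory Django Template (the URL and context values are the question's placeholders; error handling is omitted):

        import urllib2

        from django.http import HttpResponse
        from django.template import Context, Template

        def external_page(request):
            # Fetch the raw template text over HTTP (placeholder URL from the question).
            raw = urllib2.urlopen("http://docs.google.com/View?id=bla").read()

            # Treat the fetched text as a Django template and render it with a context.
            template = Template(raw)
            html = template.render(Context({"args": "..."}))
            return HttpResponse(html)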

    Read the article

  • Caching Authentication Data

    - by PartlyCloudy
    Hi, I'm currently implementing a REST web service using CouchDB and RESTlet. The RESTlet layer is mainly for authentication and some minor filtering of the JSON data served by CouchDB: Clients <= HTTP = [ RESTlet <= HTTP = CouchDB ] I'm also using CouchDB to store user login data, because I don't want to add an additional database server for that purpose. Thus, each request to my service causes two CouchDB requests conducted by RESTlet (auth data + "real" request). In order to keep the service as efficient as possible, I want to reduce the number of requests, in this case the redundant requests for login data. My idea now is to provide a cache (i.e. an LRU cache via LinkedHashMap) within my RESTlet application that caches login data, because HTTP caching will probably not be enough. But how do I invalidate the cached data once a user changes the password, for instance? Thanks to REST, the application might run on several servers in parallel, and I don't want to create a central instance just to cache login data. Currently, I save requested auth data in the cache and try to authenticate new requests against it. If authentication fails or there is no entry available, I dispatch a GET request to my CouchDB storage in order to obtain the actual auth data. So in the worst case, users who have changed their data will perhaps still be able to log in with their old credentials. How can I deal with that? Or what is a good strategy to keep the cache(s) up to date in general? Thanks in advance.
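    A minimal sketch of the LinkedHashMap-based LRU cache mentioned above (the capacity of 1000 entries is an arbitrary assumption; expiry/TTL logic is left out):

        import java.util.LinkedHashMap;
        import java.util.Map;

        // LRU cache: a LinkedHashMap in access order evicts the least recently
        // used entry once the size limit is exceeded.
        public class AuthCache<K, V> extends LinkedHashMap<K, V> {

            private static final int MAX_ENTRIES = 1000; // assumed capacity

            public AuthCache() {
                super(16, 0.75f, true); // accessOrder = true -> LRU behaviour
            }

            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > MAX_ENTRIES;
            }
        }

    Pairing each cached entry with a timestamp and treating entries older than a few minutes as misses is one common way to bound how long revoked credentials could keep working, at the cost of some extra lookups.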

    Read the article

  • MYSQL join - reference external field from nested select?

    - by PHP thinker
    Is it allowed to reference an external field from a nested select? E.g.

        SELECT FROM ext1
        LEFT JOIN (SELECT * FROM int2 WHERE int2.id = ext1.some_id) AS x ON 1=1

    In this case the nested select references ext1.some_id, and I am getting an error saying that the field ext1.some_id is unknown. Is this possible? Is there some other way?
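    For what it's worth, a MySQL derived table (the parenthesised subquery in the FROM clause) cannot refer to columns of other tables at the same level, which is why ext1.some_id is reported as unknown. A common workaround is to move the correlation into the join condition, roughly like this (the selected column list is an assumption):

        -- Sketch of an equivalent query with the correlation in the ON clause
        SELECT ext1.*, int2.*
        FROM ext1
        LEFT JOIN int2 ON int2.id = ext1.some_id;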

    Read the article

  • Prevent DTD download when parsing XML, redux

    - by harpo
    I have a folder full of XHTML files and a console program to process them. The problem is that it doesn't work when I'm offline, because "The remote name could not be resolved: 'www.w3.org'". A number of sites, including this one, say to set the XmlResolver to null. That is not working. I have tried setting the XmlResolver to null every way that I can think of: in the XmlDocument, in the XmlTextReader, and in the XmlReaderSettings. I believe it's just creating a default instance anyway. I have also tried implementing my own XmlResolver, but I don't know what to return for external items:

        public class NonXmlResolver : XmlUrlResolver
        {
            public override object GetEntity(Uri absoluteUri, string role, Type ofObjectToReturn)
            {
                if (absoluteUri.Scheme == "http")
                    throw new Exception("What you talking 'bout, Willis?");
                else
                    return base.GetEntity(absoluteUri, role, ofObjectToReturn);
            }
        }

    The documents do not use any special entities, only pure XML. This is possible, right?
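    A small sketch of one route that avoids the DTD fetch altogether: create the XmlReader with settings that skip DTD processing and hand that reader to XmlDocument (the file name is a placeholder; DtdProcessing needs .NET 4, older frameworks expose ProhibitDtd instead):

        using System.Xml;

        class Program
        {
            static void Main()
            {
                var settings = new XmlReaderSettings
                {
                    DtdProcessing = DtdProcessing.Ignore, // skip the DOCTYPE instead of resolving it
                    XmlResolver = null                    // and never go to the network
                };

                using (var reader = XmlReader.Create("page.xhtml", settings))
                {
                    var doc = new XmlDocument { XmlResolver = null };
                    doc.Load(reader); // no attempt to download the XHTML DTD from w3.org
                }
            }
        }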

    Read the article

  • Run an external program with a specified max running time

    - by jack
    I want to execute an external program in each thread of a multi-threaded Python program. Let's say the maximum running time is set to 1 second. If the started process completes within 1 second, the main program captures its output for further processing. If it doesn't finish within 1 second, the main program just terminates it and starts another new process. How do I implement this?
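    One possible shape for this, sketched with the standard subprocess module (the 0.05 s poll step is arbitrary; on Python 3.3+ subprocess's built-in timeout support could replace the loop):

        import subprocess
        import time

        def run_with_timeout(cmd, timeout=1.0):
            """Run `cmd`; return its output, or None if it had to be killed."""
            proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            deadline = time.time() + timeout

            while proc.poll() is None:          # still running?
                if time.time() > deadline:
                    proc.kill()                 # overran the budget: terminate it
                    proc.wait()
                    return None
                time.sleep(0.05)                # small poll interval

            out, _ = proc.communicate()         # finished in time: collect stdout
            return out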

    Read the article

  • changing user in ubuntu

    - by Rahul Mehta
    Hi, this is my ls -all output. The zfapi folder is owned by root; how can I change this to www-data? Also, please advise what the first root and the second root are. Thanks.

        drwxr-xr-x 4 www-data www-data  4096 2011-01-06 18:21 cdnapi
        -rw-r--r-- 1 www-data www-data   678 2010-08-30 12:02 config.js
        drwxr-xr-x 4 www-data www-data  4096 2010-11-23 15:55 css
        drwxr-xr-x 7 www-data www-data  4096 2010-11-17 13:12 images
        -rw-r--r-- 1 www-data www-data 25064 2010-12-17 18:26 index.html
        -rw-r--r-- 1 www-data www-data 19830 2010-12-18 11:24 init.js
        drwxr-xr-x 2 www-data www-data  4096 2010-12-02 12:34 lib
        -rw-r--r-- 1 www-data www-data 18758 2010-12-06 18:00 styles.css
        -rw-r--r-- 1 www-data www-data  1081 2010-10-21 17:56 testbganim.html
        drwxr-xr-x 2 www-data www-data  4096 2010-12-17 11:15 yapi
        drwxr-xr-x 7 root     root      4096 2011-01-07 18:20 zfapi
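    In an ls -l listing the third and fourth columns are the owning user and owning group, so here zfapi belongs to user root and group root. A minimal sketch of the ownership change, assuming it is run from the parent directory and sudo is available:

        # Recursively hand the zfapi folder (and everything in it) to www-data:www-data
        sudo chown -R www-data:www-data zfapi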

    Read the article

  • AIR: sync gui with data-base?

    - by John Isaacks
    I am going to be building an AIR application that shows a list (about 1-25 rows of data) from a database. The database is on the web. I want the list to be as accurate as possible, meaning that as soon as the database data changes, the list displayed in the app should update asap. I do not know of any way the AIR application could be notified when there is a change, so I am thinking I am going to have to poll the database at certain intervals to keep the list up to date. So my question is: first, is there any way to NOT have to keep checking the database? Or, if I do have to keep checking the database, what is a reasonable interval to do that at? Thanks.
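    If polling turns out to be the answer, a bare-bones ActionScript sketch might look like this (the 10-second interval and the endpoint URL are assumptions; refreshList is a hypothetical function that redraws the list from the response):

        import flash.utils.Timer;
        import flash.events.TimerEvent;
        import flash.events.Event;
        import flash.net.URLLoader;
        import flash.net.URLRequest;

        var pollTimer:Timer = new Timer(10000); // check the web service every 10 s

        pollTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
            var loader:URLLoader = new URLLoader();
            loader.addEventListener(Event.COMPLETE, function(ev:Event):void {
                refreshList(loader.data); // hypothetical: update the displayed rows
            });
            loader.load(new URLRequest("http://example.com/rows.json"));
        });

        pollTimer.start();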

    Read the article

  • Internal class and access to external members.

    - by Knowing me knowing you
    I always thought that an internal (nested) class has access to all data in its external (enclosing) class, but I have this code:

        template<class T>
        class Vector
        {
            template<class T>
            friend std::ostream& operator<<(std::ostream& out, const Vector<T>& obj);
        private:
            T** myData_;
            std::size_t myIndex_;
            std::size_t mySize_;
        public:
            Vector() : myData_(nullptr), myIndex_(0), mySize_(0)
            {
            }
            Vector(const Vector<T>& pattern);
            void insert(const T&);
            Vector<T> makeUnion(const Vector<T>&) const;
            Vector<T> makeIntersection(const Vector<T>&) const;

            class Iterator : public std::iterator<std::bidirectional_iterator_tag, T>
            {
            private:
                T** itData_;
            public:
                Iterator() //<<<<<<<<<<<<<------------COMMENT
                { /*HERE I'M TRYING TO USE ANY MEMBER FROM Vector<T> AND I'M GETTING
                     ERR SAYING: ILLEGAL CALL OF NON-STATIC MEMBER FUNCTION*/ }
                Iterator(T** ty)
                {
                    itData_ = ty;
                }
                Iterator operator++()
                {
                    return ++itData_;
                }
                T operator*()
                {
                    return *itData_[0];
                }
                bool operator==(const Iterator& obj)
                {
                    return *itData_ == *obj.itData_;
                }
                bool operator!=(const Iterator& obj)
                {
                    return *itData_ != *obj.itData_;
                }
                bool operator<(const Iterator& obj)
                {
                    return *itData_ < *obj.itData_;
                }
            };

            typedef Iterator iterator;

            iterator begin() const
            {
                assert(mySize_ > 0);
                return myData_;
            }
            iterator end() const
            {
                return myData_ + myIndex_;
            }
        };

    See the line marked COMMENT. So can I or can't I use members from the external class while in the internal class? Don't bother about the naming; it's not a Vector, it's a Set. Thank you.
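    As a side illustration of the language rule in play here (this is not the asker's code): a nested class may name the members of its enclosing class, but it has no implicit pointer to an enclosing-class instance, so non-static members can only be used through an object that is passed in explicitly:

        #include <iostream>

        class Outer
        {
            int value_;
        public:
            Outer() : value_(42) {}

            class Inner
            {
                Outer* owner_;              // the nested class keeps its own link to an Outer
            public:
                explicit Inner(Outer* o) : owner_(o) {}
                int read() const { return owner_->value_; } // fine: access via an instance
            };

            Inner makeInner() { return Inner(this); }
        };

        int main()
        {
            Outer outer;
            std::cout << outer.makeInner().read() << '\n'; // prints 42
        }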

    Read the article

  • SSIS Handling External Issues

    - by durilai
    I have an SSIS package that works fine. The package runs every night and takes about 4 hours to complete. I am a newbie to SSIS, so I want to see what my options are. I am not finding anything on the web about these two issues, so any advice is greatly appreciated. First, what do I do when I have an external issue such as a power failure or accidental restart? Is there a way to alert someone, or to have the package begin again on restart? Second, a couple of weeks ago a process hung and locked a table, preventing the package from executing. What is the best way to ensure I have the proper access before starting and, if not, to get that access? I am OK with killing the processes, etc. Looking for best-practice info. Thanks

    Read the article

  • HLSL How can one pass data between shaders / read existing colour value?

    - by RJFalconer
    Hello all, I have 2 HLSL ps2.0 shaders. Simplified, they are:

    Shader 1
      - Reads texture
      - Outputs colour value based on this texture

    Shader 2
      - Needs to read in existing colour (or have it passed in/read from a register)
      - Outputs the final colour, which is a function of the previous colour

    (They need to be different shaders as I've reached the maximum vertex-shader outputs for one shader.) My problem is I cannot work out how Shader 2 can access the existing fragment/pixel colour. Is the only way for shaders to interact really just the alpha blending options? These aren't sufficient if I want to use the colour as input to my function.

    Read the article

  • Use queried json data in a function

    - by SztupY
    I have code similar to this:

        $.ajax({
            success: function(data) {
                text = '';
                for (var i = 0; i < data.length; i++) {
                    text = text + '<a href="#" id="Data_' + i + '">' + data[i].Name + "</a><br />";
                }
                $("#SomeId").html(text);
                for (var i = 0; i < data.length; i++) {
                    $("#Data_" + i).click(function() {
                        alert(data[i]);
                        RunFunction(data[i]);
                        return false;
                    });
                }
            }
        });

    This gets an array of some data in JSON format, then iterates through the array generating a link for each entry. Now I want to add a handler to each link that will run a function that does something with this data. The problem is that the data seems to be unavailable once the ajax success function has returned (although I thought closures would keep it around). What is the best way to use the queried JSON data later on? (I think setting it as a global variable would do the job, but I want to avoid that, mainly because this ajax request might be called multiple times.) Thanks.
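    The likely culprit is that every click handler closes over the same loop variable i, which has already run past the end of the array by the time a link is clicked. One common fix, sketched here, is to capture the current element per iteration:

        // Inside the same success callback: give each handler its own copy of the
        // element by passing it through an immediately-invoked function.
        for (var i = 0; i < data.length; i++) {
            (function(item) {
                $("#Data_" + i).click(function() {
                    RunFunction(item);   // `item` stays bound to data[i] for this link
                    return false;
                });
            })(data[i]);
        }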

    Read the article

  • Jquery: Calling functions from different documents

    - by Tom
    Hi, I've got some jQuery functions that I keep in a "custom.js" file. On some pages I need to pass PHP variables to the jQuery code, so some jQuery bits need to remain in the HTML documents. However, as I'm now trying to refactor things to the minimum, I'm tripping over the following. If I put this in my custom.js:

        $(document).ready(function() {
            function sayHello() {
                alert("hello");
            }
        }

    And this in an HTML document:

        <script type="text/javascript">
            $(document).ready(function() {
                sayHello();
            });
        </script>

    ... the function doesn't get called. However, if both are placed in the HTML document, the function works fine. Is there some kind of public property I need to declare for the function, or how do I get the jQuery functions in my HTML to talk to the external .js file? It is correctly included and works fine otherwise. Thanks.
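    A sketch of the usual fix: a function declared inside a $(document).ready callback is local to that callback, so declaring it at the top level of custom.js (or attaching it to window) makes it visible to the page's own scripts:

        // custom.js - declared at file scope, so any page that includes this
        // script can call it.
        function sayHello() {
            alert("hello");
        }

        // In the HTML document:
        // <script type="text/javascript">
        //     $(document).ready(function() {
        //         sayHello();
        //     });
        // </script>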

    Read the article

  • How do I display data in a table and allow users to copy selected data?

    - by cfouche
    Hi, I have a long list of data that I want to display in table format to users. The data changes when the user performs certain actions in my app, but it is not directly editable. So the user can create a reasonably big table of data, but he can't change individual cells' values. However, I do want the data to be copyable: the user should be able to select some or all of the cells, press Ctrl-C to copy the data to the clipboard, and then Ctrl-V to paste it into an external text editor. At the moment I'm displaying the data in a ListView with a GridView, and this works perfectly except that GridView doesn't allow one to copy data. What other options can I try? Ours is a WPF app, coded in C#.
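    One candidate worth checking is the WPF DataGrid (built into .NET 4, or available in the WPF Toolkit before that), which still supports cell selection and Ctrl-C when marked read-only. A minimal sketch, with the binding path assumed:

        <!-- Read-only grid whose selected cells can be copied to the clipboard -->
        <DataGrid ItemsSource="{Binding Rows}"
                  IsReadOnly="True"
                  SelectionUnit="Cell"
                  SelectionMode="Extended"
                  ClipboardCopyMode="IncludeHeader" />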

    Read the article

  • Executing an external executable, waiting till it finishes, and continuing the setup using NSIS

    - by Ramesh
    I am new to the NSIS install creator. I need to run an external executable because it is a prerequisite, and once it has finished I will continue the setup. I tried the code below, but it just copies the exe to the installation path.

        Section "example" example
          SetOutPath "$INSTDIR"
          MessageBox MB_OK \
            "The applications."
          File "Prerequisites\setup.exe"
          ExecWait '"exec" /i "$INSTDIR\setup.exe" /passive'
          SetRebootFlag true
        SectionEnd
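    For comparison, a hedged sketch of the usual ExecWait pattern: the command line handed to ExecWait has to start with the program to run (here the extracted setup.exe itself), not the literal word "exec". The /passive switch is kept from the question and the exit-code handling is optional:

        Section "example" example
          SetOutPath "$INSTDIR"
          File "Prerequisites\setup.exe"
          ; Run the prerequisite installer and block until it exits; $0 gets the exit code
          ExecWait '"$INSTDIR\setup.exe" /passive' $0
          DetailPrint "Prerequisite installer returned $0"
          SetRebootFlag true
        SectionEnd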

    Read the article

  • How to use a data type (table) defined in another database in SQL2k8?

    - by Victor Rodrigues
    I have a table type defined in a database. It is used as a table-valued parameter in a stored procedure. I would like to call this procedure from another database, and in order to pass the parameter I need to reference this defined type. But when I do DECLARE @table dbOtherDatabase.dbo.TypeName, it tells me: "The type name 'dbOtherDatabase.dbo.TypeName' contains more than the maximum number of prefixes. The maximum is 1." How can I reference this table type?

    Read the article

  • Copy and paste between sheets in a workbook with VBA code

    - by Hannah
    I'm trying to write a macro in VBA for Excel that looks at the value in a certain column of each row of data in a list and, if that value is "yes", copies and pastes the entire row onto a different sheet in the same workbook. Let's name the two sheets "Data" and "Final". I want the sheets referenced explicitly so it does not matter which sheet is open when the code runs. I was going to use a Do loop to cycle through the rows of the data sheet until it finds there are no more entries, and If statements to check the values. I am confused about how to switch from one sheet to the next. How do I specifically reference cells in different sheets? Here is the pseudocode I had in mind:

        Do While DataCells(x,1).Value <> " "
            for each DataCells(x,1).Value = "NO"
                if DataCells(x,2).Value > DataCells(x,3).Value or _
                   DataCells(x,4).Value < DataCells(x,5).Value
                    'Copy and paste/insert row x from Data to Final sheet adding a new
                    'row for each qualifying row
                else
                    x = x + 1
                end
            else if DataCells(x,1).Value = "YES"
        Loop
        'copy and paste entire row to a third sheet
        'continue this cycle until all rows in the data sheet are examined
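    A minimal runnable sketch of the sheet-qualified version (the column positions and the "YES" test are assumptions based on the description; the extra numeric comparisons from the pseudocode are left out). The key point is that every Cells/Rows call is qualified with a Worksheet object, so the active sheet does not matter:

        Sub CopyYesRows()
            Dim wsData As Worksheet, wsFinal As Worksheet
            Dim x As Long, nextRow As Long

            Set wsData = ThisWorkbook.Worksheets("Data")
            Set wsFinal = ThisWorkbook.Worksheets("Final")

            x = 1
            ' First empty row on the Final sheet
            nextRow = wsFinal.Cells(wsFinal.Rows.Count, 1).End(xlUp).Row + 1

            Do While wsData.Cells(x, 1).Value <> ""
                If UCase(wsData.Cells(x, 1).Value) = "YES" Then
                    wsData.Rows(x).Copy Destination:=wsFinal.Rows(nextRow)
                    nextRow = nextRow + 1
                End If
                x = x + 1
            Loop
        End Sub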

    Read the article

  • CSC folder data access AND roaming profiles issues (Vista with Server 2003, then 2008)

    - by Alex Jones
    I'm a junior sysadmin for an IT contractor that helps small, local government agencies, like little towns and the like. One of our clients, a public library with ~50 staff users, was recently migrated from Server 2003 Standard to Server 2008 R2 Standard in a very short timeframe; our senior employee, the only network engineer, had suddenly put in his two weeks' notice, so management pushed him to do this project before quitting. A bit hasty on management's part? Perhaps. Could we do anything about that? Nope. Do I have to fix this all by myself? Pretty much.

    The network is set up like this:

    a) 50ish staff workstations, all running Vista Business SP2. All staff use MS Outlook, which uses RPC-over-HTTPS ("Outlook Anywhere") for cached Exchange access to an offsite location.
    b) One new (virtualized) Server 2008 R2 Standard instance, running atop a Server 2008 R2 host via Hyper-V. The VM is the domain's DC, and also the site's one and only file server. Let's call that VM "NEWBOX".
    c) One old physical Server 2003 Standard server, running the same roles. Let's call it "OLDBOX". It's still on the network and accessible, but it's been demoted, and its shares have been disabled. No data has been deleted.
    d) Gigabit Ethernet everywhere. The organization only has one domain, and it did not change during the migration.
    e) Most users were set up for a combo of redirected folders + offline files, but some older employees who had been with the organization a long time are still on roaming profiles.

    To sum up: the servers in question handle user accounts and files, nothing else (e.g. no TS, no mail, no IIS, etc.).

    I have two major problems I'm hoping you can help me with:

    1) Even though all domain users have had their redirected folders moved to the new server, and logging in to their workstations and testing confirms that the Documents/Music/Whatever folders point to the new paths, it appears some users (not laptops or anything either!) had been working offline from OLDBOX for a long time, and nobody realized it. Here's the ugly implication: a bunch of their data now lives only in their CSC folders, because they can't access the share on OLDBOX and finally sync with it. How do I get this data out of those CSC folders and onto NEWBOX?

    2) What's the best way to migrate roaming-profile users to non-roaming ones, without losing vital data like documents, any lingering PSTs, etc.?

    Things I've thought about trying:

    For problem 1:

    a) Re-enable the documents share on OLDBOX, force an Offline Files sync for ALL domain users, then copy OLDBOX's share's data to the equivalent share on NEWBOX. Reinitialize the Offline Files cache for every user. With this: How do I safely force a domain-wide Offline Files sync? Could I lose data by re-enabling the share on OLDBOX and forcing the sync? Afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation?

    b) Determine which users have unsynced changes to OLDBOX (again, how?), search each user's CSC folder domain-wide via workstation admin shares, and grab the unsynced data. Reinitialize the Offline Files cache for every user. With this: How can I detect which users have unsynced changes with a script? How can I search each user's CSC folder, when the ownership and permissions set for CSC folders are so restrictive? Again, afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation?

    c) Manually visit each workstation, copy the contents of the CSC folder, and manually copy that data onto NEWBOX. Reinitialize the Offline Files cache for every user. With this: Again, how do I 'break into' the CSC folder and get to its data? As an experiment, I took one workstation's HD offsite, imaged it for safety, and then tried the following with one of our shop PCs after attaching the drive: grant myself full control of the folder (failed), grant myself ownership of the folder (failed), run chkdsk on the whole drive to make sure nothing's messed up (all OK), try to take full control of the entire drive (failed), try to take ownership of the entire drive (failed). MS KB articles and Googling around suggest there's a utility called CSCCMD that's meant for this exact scenario... but it looks like it's available for XP, not Vista, no? Again, afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation?

    For problem 2:

    a) Figure out which users are on roaming profiles, and where their profiles 'live' on the server. Create new folders for them in the redirected-folders repository, migrate existing data, and disable the roaming. With this: Finding out who's roaming isn't hard. But what's the best way to disable the roaming itself? In AD Users and Computers, or on each user's workstation? Doing it centrally on the server seems more efficient; that said, all of the KB research I've done turns up articles on how to go from local to roaming, not the other way around, so I don't have good documentation on this.

    In closing: we have good backups of NEWBOX and OLDBOX, but not of the workstations themselves, so anything drastic on the client side would need imaging and testing for safety. Thanks for reading this far! Hopefully you can help me dig us out of this mess.

    Read the article
