Search Results

Search found 23098 results on 924 pages for 'multiple processes'.


  • What term is used to describe running frequent batch jobs to emulate near real time

    - by Steven Tolkin
    Suppose users of application A want to see the data updated by application B as frequently as possible. Unfortunately, neither app A nor app B can use message queues, and they cannot share a database. So app B writes a file, and a batch job periodically checks to see if the file is there, and if so, loads it into app A. Is there a name for this concept? A very explicit and geeky description: "running very frequent batch jobs in a tight loop to emulate near real time". This concept is similar to "polling". However, polling has the connotation of being very frequent, multiple times per second, whereas the most often you would run a batch job is every few minutes. A related question -- what is the tightest loop that is reasonable? Is it 1 minute, 5 minutes, or ...? Recall that the batch jobs are started by a batch job scheduler (e.g. Autosys, Control-M, CA ESP, Spring Batch, etc.), so running a job too frequently would cause overhead and clutter.

    Read the article

  • Cannot get Virtual PC to install on my machine

    - by Brandon
    I'm not sure what I'm doing wrong. I ran the tool to check that hardware virtualization is enabled, and it is (i7-2600). I ran the proper version of the Virtual PC download (Windows 7 Professional x64) and I can see the Windows Virtual PC (KB958559) update in my installed updates list. The .msu file runs fine without errors, but even after multiple reboots I can't actually find a trace of Virtual PC anywhere. There is nothing in program files, the start menu, or system32. Am I missing a step?

    Read the article

  • Web Farm Application deployment best practices

    - by rauts
    Hi all, we have a web farm which hosts multiple ASP.NET applications. We typically have 4 servers on the farm. The dilemma I am having is about the capacity of the farm. Let's say I currently have 200 apps in total. Should I deploy all 200 apps on all 4 servers (i.e. keep all the servers in the farm identical), or should I split the applications between 2 sets of servers and create 2 smaller farms, so that I can then manage each application based on its criticality, usage, etc.? Any best practices in this area would be highly appreciated. Thanks, Rauts

    Read the article

  • Is there a way to force/manage file locking in windows?

    - by JPbuntu
    I have two Windows machines networked and I am having trouble with simultaneous access to files. I would like only one user to be able to open a file at a time, which I thought was automatic, using file locks... if the program used to access the file actually locks it. I believe the problem is that some of the programs I use don't lock the file, which can therefore be modified simultaneously by multiple users - very much not desired. Currently I am having this problem with only two computers, although as soon as I can figure out a solution the network is going to be expanded to 6 computers, which will include Windows 7, Vista, and XP, as well as a central file server (Samba). Is there a way to ensure that all files opened in Windows get locked? Any suggestions are appreciated, thanks.

    Read the article

  • How should I organize my backups ?

    - by Patrick
    I'm using rsync for the first time to create daily backups of my websites, and I was wondering whether I should overwrite the previous copy or create multiple copies and overwrite only the oldest one (I might not have enough space for that, though). I actually also have this question: let's suppose most of the files are accidentally erased... does rsync delete all these files from the backup space because they no longer exist at the source? How exactly does it work in this case? Thanks
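    On the second question: if you mirror into the same target directory with --delete, files removed at the source are also removed from the backup on the next run; without --delete they simply remain in the backup. For the first question, a minimal sketch of one common pattern is below (not from the question itself): dated copies that share unchanged files via hardlinks, so several days of history cost little extra space. The paths, the date format, and the 7-copy retention are illustrative assumptions.

        #!/bin/sh
        # Hypothetical paths; adjust to your own layout.
        SRC=/var/www/mysite/
        DEST=/backup/mysite
        TODAY=$(date +%F)    # e.g. 2012-06-15, so directory names sort by date

        # Copy today's state; files unchanged since the last run are hardlinked
        # against the previous copy instead of being stored a second time.
        rsync -a --link-dest="$DEST/latest" "$SRC" "$DEST/$TODAY"
        ln -sfn "$DEST/$TODAY" "$DEST/latest"

        # Keep only the 7 newest dated copies (GNU head/xargs assumed).
        ls -1d "$DEST"/????-??-?? | head -n -7 | xargs -r rm -rf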

    Read the article

  • Autoplay for USB drive keeps popping up

    - by Adam Haile
    I have a Seagate external 2TB USB drive that when I leave it plugged in, the Autoplay dialog asking what to do with the drive pops up about once every 30-60 minutes. No matter how many times I tell it to go away, it keeps coming back. I have tried multiple power supplies, USB cables, and all the USB ports on my machine...but it keeps doing the same thing. Also, this does not happen with the same drive on a different computer. Could this be some power or configuration setting that is causing this? I'm running Windows 7 x64 (the computer it works on is x86)

    Read the article

  • Dual monitor not working completely in 12.10 after upgrade

    - by Mark Baldridge
    At 12.04, dual monitors worked perfectly. After upgrading to 12.10, the primary monitor works, but the second monitor only partly works. I am sure there is some difference between the releases that I have missed setting properly. System Settings - Displays shows both correctly as Acer 22" monitors at 1680x1050 (16:10). An icon on monitor 2 is present, but elongated; almost an artifact, since other icons from the primary screen are absent, but this one icon is there on the second monitor. Selecting icons works on both screens. Painting is weird on monitor 2. The Launcher exists and works on both screens, but even with sticky edges off, the cursor stops at the left edge of monitor 2. Clicking on Text Editor in the screen 2 launcher will launch gedit there. If I drag it, it leaves a trail of after-images, as if repainting is failing. If I move the cursor along the launcher, the help tags like "LibreOffice Writer" appear, but stay on screen unless I drag the active gedit window over them; then part of the help bubbles are overwritten, leaving behind after-images of the gedit window on screen. What is really fascinating is that System Settings - Displays is now ignoring monitor selection, after allowing it earlier. Just before this, the help popup which said "Select a monitor to change its properties; drag to rearrange its placement" actually let me do that. Maybe a trick of where I grab the edge of the monitor in the Displays setting - I just found a working handle. When I drag monitor 1 to the right of monitor 2, "Apply" and confirm, both monitors work normally (although the right monitor lets the cursor slide off the right edge onto the left edge of monitor 1 - which sounds correct). Painting of windows does not leave an after-image. However, success is only temporary. The setting survives a reboot, but painting on the left monitor, now monitor 2, replicates the issues from before. The after-image of the gedit window and the small window for "Are you sure you want to close all programs and restart the computer?" are still on monitor 2 (on the left now), even though they are not real windows, nor do they have processes behind them. Curiously, in Displays, the "green" monitor on the left in the display window is matched by the right monitor's colour in its upper left corner - probably makes sense, as the one on the right is now monitor 1. If I repeat the "drag the left monitor to the right of the right monitor" step in the Displays window, things are oriented properly, with no display artifacts as I drag windows around either screen. The description bubbles that pop up are also overwritten properly on both screens, so none of those artifacts either. This goodness does not survive a reboot, however. I have not tried logging out and back in. All of this came after positing that the motherboard VGA and HDMI ports could have been the issue, so I installed an e-GeForce 7600 GT Dual DVI (I know the web thinks it is not DVI, but VGA, but the connectors are DVI). No change to the weird behavior: the good parts continue to work, the weirdness remains, and swapping monitor positions seems to cure the issue. So, is there a setting I have missed? Given that "swapping" monitors 1 and 2 in System Settings - Displays makes it work, just not across a boot, I suspect so.

    Read the article

  • Performing a Depth First Search iteratively using async/parallel processing?

    - by Prabhu
    Here is a method that does a DFS search and returns a list of all items given a top level item id. How could I modify this to take advantage of parallel processing? Currently, the call to get the sub items is made one by one for each item in the stack. It would be nice if I could get the sub items for multiple items in the stack at the same time, and populate my return list faster. How could I do this (either using async/await or TPL, or anything else) in a thread safe manner?

        private async Task<IList<Item>> GetItemsAsync(string topItemId)
        {
            var items = new List<Item>();
            var topItem = await GetItemAsync(topItemId);
            Stack<Item> stack = new Stack<Item>();
            stack.Push(topItem);
            while (stack.Count > 0)
            {
                var item = stack.Pop();
                items.Add(item);
                var subItems = await GetSubItemsAsync(item.SubId);
                foreach (var subItem in subItems)
                {
                    stack.Push(subItem);
                }
            }
            return items;
        }

    I was thinking of something along these lines, but it's not coming together:

        var tasks = stack.Select(async item =>
        {
            items.Add(item);
            var subItems = await GetSubItemsAsync(item.SubId);
            foreach (var subItem in subItems)
            {
                stack.Push(subItem);
            }
        }).ToList();
        if (tasks.Any()) await Task.WhenAll(tasks);

    The language I'm using is C#.
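    One way to parallelize this is to process the tree level by level rather than sharing a Stack<T> across tasks; the sketch below is an illustration under that assumption, not the poster's solution. It batches the GetSubItemsAsync calls for a whole frontier with Task.WhenAll (so the traversal becomes breadth-first rather than strictly depth-first), and only the single awaiting method mutates the result list, which sidesteps the thread-safety issue in the attempt above.

        // Sketch: assumes the poster's Item type, GetItemAsync and GetSubItemsAsync.
        private async Task<IList<Item>> GetItemsParallelAsync(string topItemId)
        {
            var items = new List<Item>();
            var frontier = new List<Item> { await GetItemAsync(topItemId) };

            while (frontier.Count > 0)
            {
                items.AddRange(frontier);

                // Fetch the sub-items for the entire frontier concurrently.
                var fetches = frontier.Select(item => GetSubItemsAsync(item.SubId));
                var results = await Task.WhenAll(fetches);

                // The combined results form the next frontier; no other thread
                // touches 'items' or 'frontier', so no locking is required.
                frontier = results.SelectMany(subItems => subItems).ToList();
            }
            return items;
        }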

    Read the article

  • About the new Microsoft Innovation Center (MIC) in Miami

    - by Herve Roggero
    Originally posted on: http://geekswithblogs.net/hroggero/archive/2014/08/21/about-the-new-microsoft-innovation-center-mic-in-miami.aspx Last night I attended a meeting at the new MIC in Miami, run by Blain Barton (@blainbar), Sr IT Pro Evangelist at Microsoft. The meeting was well attended and is meant to be run in a user group format in a casual setting. Many of the local Microsoft MVPs and group leaders were in attendance as well, which allows technical folks to connect with community leaders in the area. If you live in South Florida, I highly recommend looking out for future meetings at the MIC; most meetings will be about the Microsoft Azure platform, covering either IT Pro or Dev topics. For more information on the MIC, check out this announcement: http://www.microsoft.com/en-us/news/press/2014/may14/05-02miamiinnovationpr.aspx. About Herve Roggero: Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Blue Syntax Consulting (http://www.bluesyntaxconsulting.com). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, and MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and "PRO SQL Server 2012 Practices" from Apress, a PluralSight author, and runs the Azure Florida Association.

    Read the article

  • Why are there so few Wireless N Dual Band adapter PCI cards, only USB adapters instead?

    - by daiphoenix
    There have been several Wireless N Dual Band routers/APs on the market for quite some time now, and there are several Wireless N Dual Band USB adapters out there. But as for PCI/PCI-X card adapters, there seems to be only one (the Linksys WMP600N). Why is that? I find it very strange. Is it because the USB adapters are easier to install and can be used on multiple computers? But if so, why isn't it the same with single-band (2.4 GHz) Wireless N adapters? For those there are as many PCI card adapters as there are USB adapters. Also, can the USB adapters, despite the lack of external antennas, offer the same level of performance as a card with external antennas?

    Read the article

  • How to make a drive partition and install Windows on it from an HP install disc

    - by Zohaib
    I bought a new HP DV6-S190SE, and I want to make multiple partitions on the hard drive. I went to HP's site and discussed it with them via online chat. They said that it is not useful to make more than one partition, because when you recover your Windows installation at some later time it will erase/delete all files, including the new partitions, so it would not be very useful. Now, is there any way to get rid of the existing structure and install Windows only on the C drive? And first of all, how do I partition the hard drive?

    Read the article

  • Single CAS web application in a cluster

    - by Dolf Dijkstra
    Recently a customer wanted to set up a cluster of CAS nodes to be used together with WebCenter Sites. In the process of setting this up they realized that they needed to create a web application per managed server. They did not want to have this management burden but would like to have one web application deployed to multiple nodes. The reason that there is a need for a unique application per node is that the web application contains information that needs to be unique per node: the postfix for the ticket id. My customer would like to externalize the node-specific configuration to either a specific classpath per managed server or to system properties set at startup. It turns out that the postfix for ticket ids is managed through a property, host.name, and that this property can be externalized. The host.name property is used in /webapps/cas/WEB-INF/spring-configuration/uniqueIdGenerators.xml. It is set in /webapps/cas/WEB-INF/spring-configuration/propertyFileConfigurer.xml in a PropertyPlaceholderConfigurer. The documentation for PropertyPlaceholderConfigurer (http://static.springsource.org/spring/docs/2.0.x/api/org/springframework/beans/factory/config/PropertyPlaceholderConfigurer.html) indicates that the properties defined through the PropertyPlaceholderConfigurer can be externalized. To enable this externalization you would need to change host.properties so it is generic for all the managed servers and thus can be reused for all of them: host.name=${cluster.node.id}. The next step is to change the startup scripts for the managed servers and add a system property -Dcluster.node.id=<something unique and stable>. Voila, the postfix is externalized and the web application can be shared amongst the cluster nodes.
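    Put together, the node-specific part might look roughly like this (a sketch assembled from the steps above; the example node id cas-node-01 is a made-up value, and host.properties is whichever property file the CAS web application already loads):

        # host.properties - identical on every managed server
        host.name=${cluster.node.id}

        # per-managed-server JVM startup argument - the only node-specific piece
        -Dcluster.node.id=cas-node-01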

    Read the article

  • cluster of services and restarting on package upgrade

    - by Marcin Cylke
    I'm using Puppet to manage a bunch of servers. Those servers run a simple service, exposed to the world via a load balancer. The service's instances are independent in that they can run on their own, and they are deployed on multiple servers to increase responsiveness. Now, when I push a new package to the repo and Puppet catches up with it appearing there, it just updates the package on all servers. This results in a short downtime of the entire service. Is there a way of configuring Puppet to restart the services sequentially? Or using any other kind of strategy?

    Read the article

  • Free Webinar: A faster, cheaper, better IT Department with Azure

    - by Herve Roggero
    Join me for a free Webinar on Wednesday October 17th at 1:30PM, Eastern Time. I will discuss the benefits of cloud computing with the Azure platform. There isn’t a company out there that would say “No” to reduced IT costs and unlimited scaling bandwidth. This webinar will focus on the specific benefits of the Microsoft Azure cloud platform and will convince you of the sound business rationale behind moving to the cloud. From Infrastructure as a Service (IaaS) to Platform as a Service (PaaS), Azure supports quick deployments, virtual machines, native SQL Databases and much more. Topics that will be discussed: - Why use Azure for your Cloud Computing needs - IaaS and PaaS Offerings - Differing project approaches to Cloud computing - How Azure’s agility and reduced costs lead to better solutions. Attendees of this webinar will also be eligible to receive a Free Two Hour Consultation, which can include: - Review of Your Cloud Strategy - Cloud Roadmap Review - Review of Data-mart strategies - Review of Mobility Strategies. Click Here to Register Now. About Herve Roggero: Hervé Roggero, Azure MVP, is the founder of Blue Syntax Consulting, a company specialized in cloud computing products and services. Hervé's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Hervé holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Hervé is the co-author of "PRO SQL Azure" from Apress. For more information, visit www.bluesyntax.net.

    Read the article

  • Fixing/Extending Standard Windows File Open/Save Dialog [closed]

    - by scunliffe
    Possible Duplicate: Change left side link of the Save As Dialog for a DropBox one? Almost every time I use the standard Windows (XP) File Open/Save dialog I get frustrated by how long it takes me to navigate to where I want to go. :-( (I won't even get into the MS Office dialog, which makes things even worse.) This is the dialog I'm referring to (with some notes): Wouldn't a drive list be handy in here? C:\, D:\, E:\, etc. What about a breadcrumb URI? (the magenta list of links) Why isn't Program Files one of the icons on the left (green)? I'm always going in there for something. Why can't I type "../../../" to navigate up multiple directories in the File name box (blue)? There have got to be some utilities out there that can "hijack" or "overwrite" this core Windows dialog to provide a much better set of options. I'm looking for any/all solutions to help fix this dialog.

    Read the article

  • Find Files That Contain Both Words in Notepad++

    - by SethO
    In Notepad++ (v5.9), I want to search for files which contain two words. For example, I would like to find all text files in a directory that have both Alpha and Bravo in the file. They may not be next to each other and they may have multiple occurrences of either. I just want to find the files that have at least one instance of each. Is there a way to structure this search without resorting to Regular Expressions? Thanks for the advice.

    Read the article

  • Cat all files in a directory, with a specific file at the beginning an end...?

    - by Aeisor
    Is there a way to cat all files in a given directory, but with a particular file at the beginning and end? For example, say I have: file1.js, file2.js, file3.js, file4.js, file5.js -- effectively I would like to

        cat file2.js file*.js file3.js > /var/www/output.js

    I've tried a few variations of these:

        find ! -name "file2.js" ! -name "file3.js" -type f -exec cat file2.js {} file3.js > /var/www/js/output.js \;
        find ! -name "file2.js" ! -name "file3.js" -type f | xargs -I files cat file2.js files file3.js > /var/www/output.js

    but the best I can get out of it is file2.js added before and file3.js added after all other files (multiple times). I know I could specify the files in the order I wanted manually, but this is not maintainable (I'm expecting, potentially, 100 files). I have looked through man cat, as well as a handful of websites devoted to xargs, find and cat, to no avail. Thanks in advance.
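    A sketch of one way to get that ordering (not from the question): run cat for the bookend files once each, let find handle only the middle files, and group all three steps so their combined output goes to a single file. It assumes everything lives in one flat directory and that the filenames contain no whitespace.

        { cat file2.js
          find . -maxdepth 1 -name '*.js' ! -name 'file2.js' ! -name 'file3.js' | sort | xargs cat
          cat file3.js
        } > /var/www/output.js

    The repeated wrapping in the attempts above happens because -exec (and xargs -I) re-runs the whole cat command, bookends included, for every matched file; keeping file2.js and file3.js outside the find invocation avoids that.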

    Read the article

  • Using a random string to authenticate HMAC?

    - by mrwooster
    I am designing a simple webservice and want to use HMAC for authentication to the service. For the purpose of this question we have: a web service at example.com; a secret key shared between a user and the server [K]; a consumer ID which is known to the user and the server (but is not necessarily secret) [D]; a message which we wish to send to the server [M]. The standard HMAC implementation would involve using the secret key [K] and the message [M] to create the hash [H], but I am running into issues with this. The message [M] can be quite long and tends to be read from a file. I have found it very difficult to produce a correct hash consistently across multiple operating systems and programming languages because of hidden characters which make it into various file formats. This is of course bad implementation on the client side (100%), but I would like this webservice to be easily accessible and not have trouble with different file formats. I was thinking of an alternative, which would allow the use of a short (5-10 char) random string [R] rather than the message for authentication, e.g. H = HMAC(K,R). The user then passes the random string to the server and the server checks the HMAC server side (using the random string + shared secret). As far as I can see, this produces the following issues: There is no message integrity - this is OK, message integrity is not important for this service. A user could re-use the hash with a different message - I can see 2 ways around this: combine the random string with a timestamp so the hash is only valid for a set period of time, or only allow each random string to be used once. Since the client is in control of the random string, it is easier to look for collisions. I should point out that the principal reason for authentication is to implement rate limiting on the API service. There is zero need for message integrity, and it's not a big deal if someone can forge a single request (but it is if they can forge a very large number very quickly). I know that the correct answer is to make sure the message [M] is the same on all platforms/languages before hashing it. But, taking that out of the equation, is the above proposal an acceptable second best?
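    For illustration only, a minimal sketch of the nonce-plus-timestamp variant described above - an assumption-filled sketch, not a vetted design: the method names, the newline separator and the expiry window are invented for the example, and a production check would also track used nonces and use a constant-time comparison.

        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class RequestSigner
        {
            // H = HMAC(K, R + timestamp); the client sends R, the timestamp and H.
            public static string Sign(string secretKey, string nonce, DateTimeOffset timestamp)
            {
                string payload = nonce + "\n" + timestamp.ToUnixTimeSeconds();
                using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(secretKey)))
                {
                    return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
                }
            }

            // Server side: reject stale timestamps, then recompute and compare the HMAC.
            public static bool Verify(string secretKey, string nonce, DateTimeOffset timestamp,
                                      string presentedHash, TimeSpan maxAge)
            {
                if (DateTimeOffset.UtcNow - timestamp > maxAge) return false;
                return Sign(secretKey, nonce, timestamp) == presentedHash;
            }
        }

    The consumer ID [D] would travel with the request in the clear so the server knows which shared secret [K] to look up before calling Verify.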

    Read the article

  • Iterating over resources in puppet templates

    - by daveg
    So I've got a puppet manifest, with multiple resources:

        class foo {
          Custom::Resource {'resource1':
            attr1 => 'val1',
            attr2 => 'val2',
          }
          Custom::Resource {'resource2':
            attr1 => 'val3',
            attr2 => 'val4',
          }
          Custom::Resource {'resource3':
            attr1 => 'val5',
            attr2 => 'val6',
          }
        }

    If I wanted to loop over the Custom::Resource resource names that are defined in class foo in an .erb template, how do I access them? So if I wanted to write out a template that looked like this:

        ThisLine = resource1
        ThisLine = resource2
        ThisLine = resource3

    Read the article

  • How can I get data off of a damaged thumb drive?

    - by Lord Torgamus
    A guy I work with just came by to ask me about a damaged thumb drive/USB flash drive. Apparently his son dropped it on a hard surface and it won't power up anymore. They've tried plugging it into multiple machines without success, even though each port they tried was able to power other USB devices. He knows it's not a lost cause because a local tech store is offering to recover the data for $500, but he says they're not worth that much. I figured someone on SU would have an idea about this; he doesn't care about using the drive in the future, just wants to salvage a few files that would be a pain to recreate. Is this possible without advanced equipment, and if so, how? He said he already tried the advice on the Internet about typing in different drive letters and such, but that failed because there was no power going to the drive. He also said that he opened the case up at one point, but I'm not sure what, if anything, he did inside.

    Read the article

  • CD-R suddenly unreadable?

    - by TheD
    I have a CD-R which has worked fine for a good while. All of a sudden, Windows can no longer read its data. It's not scratched, it's clean, and Windows can still detect the disc and the space used on it. But whenever I attempt to access the data stored on it, either Explorer crashes, or after 5-10 minutes of trying to read it, it opens up and just shows a desktop.ini file in Explorer. This happens on multiple machines. Any ideas? Is there a way to recover the data? For example, some sector-by-sector recovery software, if any exists for CDs?

    Read the article

  • Packaging MATLAB (or, more generally, a large binary, proprietary piece of software)

    - by nfirvine
    I'm trying to package MATLAB for internal distribution, but this could apply to any piece of software with the same architecture. In fact, I'm packaging multiple releases of MATLAB to be installed concurrently. Key things: very large installation size (~4 GB); composed of a core and several plugins (toolboxes). Initially, I created a single "source" package (matlab2011b) that builds several .debs (mainly matlab2011b-core and matlab2011b-toolbox-* for each toolbox). The control file is just the standard all: dh $@. There is no Makefile; only copying files. I use a number of debian/*.install files to specify files to copy from a copy of an installation to /usr/lib/. The problem is, every time I build the thing (say, to make a correction to the core package), it recopies every file listed in the *.install files to e.g. debian/$packagename/usr/ (the build phase), and then has to bundle that into a .deb file. It takes a long time, on the order of hours, and is doing a lot of extra work. So my questions are: Can you make dh_install do a hardlink copy (like cp -l) to save time? (AFAICT from the man page, no.) Maybe I should just get it to do this in the Makefile? (That's gonna be a big Makefile.) Can you make debuild only rebuild .debs that need rebuilding, or specify which .debs to rebuild? Is my approach completely stupid? Should I break each of the toolboxes into its own source package too? (I'll have to do some silly templating or something, because there are hundreds of them. :/)

    Read the article

  • An equivalent of IceCast but for Live Video Streaming ?

    - by Kedare
    Hello, I am looking for a solution to stream live video like this:

        A camera/webcam/video output ---> Stream server ---> Clients

    And, if possible, with multiple stream servers (like IceCast):

        A camera/webcam/video output --> Master Stream Server +---> Slave Stream Server ---> Clients
                                                              |                          `--> Clients
                                                              |
                                                              `---> Slave Stream Server ---> Clients
                                                                                         `--> Clients

    The clients will be in Flash, so I think RTMP should be a good protocol. I've heard of Red5 - is it good for that? Does it scale? I would like to get statistics (number of clients, bandwidth, etc.); is that possible with Red5? Do you know any other good solution to do this? (Only free and, if possible, open source.) Thank you!

    Read the article

  • Distributed website server redundancy

    - by Keith Lion
    Assume a website infrastructure is very complicated and fully distributed (probably like most large web companies). Am I right in thinking that although there are all these extra web servers to handle multiple client requests, there is still a single "machine" through which users must enter? I am guessing this machine will be the one physically associated with the IP address? I ask because I need to know whether, in places where distributed systems exist, there is still a single point of failure - usually the control node or, in this example, the machine connected to the public internet. Surely there cannot be two machines connected to the internet, as they would have to have different IP addresses? This "machine" may not be a server per se; maybe it is a piece of Cisco equipment. I just need to know whether, in the real world, these distributed systems still have a particular section where they depend on the integrity of one electronic device.

    Read the article

  • How to control messages to the same port from different emitters?

    - by Alex In Paris
    Scene: A company has many factories X, each of which emits messages to the same receive port on a BizTalk server Y; if all messages are processed without much delay, each triggers an outgoing message to another system Z. Problem: Sometimes a factory loses its connection for a half-day or more and, when the connection is re-established, thousands of messages get emitted. Now, the messages still get processed well by Y (BizTalk can easily handle the load), but system Z can't handle the flood and may lock up and severely delay the processing of all other messages from the other Xs. What is the solution? Creating multiple receive locations that permit us to pause one X or another would lose us information if the factory isn't smart enough to know whether the message was received or not. What is the basic pattern to apply in BizTalk for this problem? Would some throttling parameters help to limit the flow from any one X? Or are there techniques on the Y side which I should use instead? I would prefer the latter, since I can be confident that the message box will remember any failures, which could then be resumed.

    Read the article
