Search Results

Search found 24301 results on 973 pages for 'execution process mfg'.

  • Install Bluetooth on a Samsung R430 under Windows 7

    - by voodoomsr
    How can I install the Bluetooth driver on this laptop under Windows 7? The installation process tells me that I need to activate the Bluetooth device to continue, but how can I activate it if, up to that moment, the device doesn't exist? There isn't any button or switch to activate the device manually. Edit: it's the NP-R430 model. I read somewhere (a user forum, not official info) that the models for Chile and Argentina don't have the Bluetooth device built in, but the US models do. Maybe for some odd regional reason they ship the South American laptops without it. The link to my notebook's specifications is on an AR domain: np-r40
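
    One way to settle whether the radio is physically present at all (the regional-model rumor) is to boot a Linux live USB and list the hardware; a minimal sketch, assuming the standard tools on the live image:

        # Does any Bluetooth radio show up on the USB or PCI bus?
        lsusb | grep -i bluetooth
        lspci | grep -i bluetooth

        # If a radio exists, is it soft- or hard-blocked by a vendor switch?
        rfkill list

    If nothing appears here, no Windows driver will conjure the device, which would confirm the forum report about the regional models.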

  • How do I modify these VPN connection settings for Xfce?

    - by Dave M G
    I have signed up for a VPN (Virtual Private Network) service, and I configured it for use on my computer that runs Gnome Classic with the following instructions:
    1. In a terminal, install the OpenVPN packages with sudo apt-get install network-manager-openvpn.
    2. Restart the network manager with sudo restart network-manager.
    3. Run sudo wget https://www.xxxxxxx.com/ovpnconfigure.zip.
    4. Extract the files from the zip with unzip ovpnconfigure.zip.
    5. Move cert.crt to /etc/openvpn.
    6. Open the Network Manager on the menu bar.
    7. Choose Add, select the OpenVPN connection type, and click Create.
    8. Enter Private Internet Access SSL for the Connection Name.
    9. Enter xxxxxx.xxxxxxxx.com for the Gateway.
    10. Select Password and enter your login credentials.
    11. Browse and select the CA certificate saved in step 4.
    12. Choose Advanced and enable LZO compression.
    13. Apply and exit.
    14. Connect using the Network Manager.
    It worked, but now I want to set up access to the same VPN service on another machine that runs Mythbuntu, which uses Xfce as its desktop manager, so every step from 6 on doesn't apply. How can I modify the above instructions to get my VPN service working with Xfce? As a further note, while I can access the Xfce desktop directly if I need to, it's more convenient for me to access it via the command line and SSH from one of my other computers, so a command-line process would be ideal. (I looked for this, and found instructions only for PPTP access, whereas I need OpenVPN.)
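
    Since the target box is reachable over SSH anyway, one route is to skip Network Manager entirely and drive OpenVPN from the command line. A minimal sketch, reusing the provider files from the steps above (auth.txt and the .ovpn file name are hypothetical placeholders):

        # plain openvpn client; no network-manager involvement
        sudo apt-get install openvpn unzip
        wget https://www.xxxxxxx.com/ovpnconfigure.zip
        unzip ovpnconfigure.zip
        sudo mv cert.crt /etc/openvpn/

        # credentials file: first line username, second line password
        sudo sh -c 'printf "myuser\nmypassword\n" > /etc/openvpn/auth.txt'
        sudo chmod 600 /etc/openvpn/auth.txt

        # start the tunnel; these flags mirror the GUI settings above
        sudo openvpn --config provider.ovpn \
            --ca /etc/openvpn/cert.crt \
            --auth-user-pass /etc/openvpn/auth.txt \
            --comp-lzo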

  • Problem with dpkg-preconfigure: how do I correct it?

    - by Eric Wilson
    I was trying to install TeamViewer, and I followed the instructions here even though they specify 11.10 instead of 12.04 (which I'm running). In particular, I executed:

        $ wget http://www.teamviewer.com/download/teamviewer_linux.deb
        $ sudo dpkg -i teamviewer_linux.deb

    The dpkg command failed, and since then my packaging system has been broken. The Software Center instructs me to try:

        $ sudo apt-get -f install

    which leads to:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following packages will be REMOVED:
          teamviewer7:i386
        0 upgraded, 0 newly installed, 1 to remove and 17 not upgraded.
        9 not fully installed or removed.
        Need to get 89.0 kB of archives.
        After this operation, 81.9 MB disk space will be freed.
        Do you want to continue [Y/n]? y
        Get:1 http://us.archive.ubuntu.com/ubuntu/ precise/main dash amd64 0.5.7-2ubuntu2 [89.0 kB]
        Fetched 89.0 kB in 1s (83.9 kB/s)
        E: Sub-process /usr/sbin/dpkg-preconfigure --apt || true returned an error code (100)
        E: Failure running script /usr/sbin/dpkg-preconfigure --apt || true

    At this point I'm stumped.
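
    For what it's worth, the standard recovery sequence for a package whose maintainer scripts have wedged dpkg (hedged: generic apt/dpkg first aid, not specific to this TeamViewer failure) is:

        # finish configuring anything half-installed
        sudo dpkg --configure -a

        # force-remove the package that broke things, skipping its scripts
        sudo dpkg --remove --force-remove-reinstreq teamviewer7:i386

        # then let apt repair the remaining dependencies
        sudo apt-get -f install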

  • Managing products on an ecommerce site [closed]

    - by John
    I've had a site that sells widgets for many years. I do not inventory my widgets, but the cost of adding them to the site and making sure the site is current is becoming prohibitive. Here are the facts: I sell a single class of widget. I have about 50,000 widgets on my site. I have about 100 vendors that create and dropship the products when they get an order from me via email. Each vendor carries from 50 to 5,000 types of widgets. Vendors all have websites with images and descriptions of their products. Each widget is produced in limited supply and usually sells out in 1-5 years. Prices of a widget often go up, sometimes by more than 50%, before it sells out. My vendors aren't very tech-sophisticated. They have websites with their products, but most can't supply an API or database dump. Their websites usually display retail prices to the public, but I log in or refer to a price list (usually Excel) for wholesale prices. As it stands now, I hire local people to add and describe each widget on our website. It usually takes a person 4 minutes to add a widget to the site, not counting the overhead of moving on to a new vendor. I feel like the upload/edit process is as good as it can get via a form/website. The problem is that it is getting very expensive to upload the widgets and keep the inventory current. I often get orders for something after it has sold out at the vendor, or at a wrong price. This seems like it would be a problem in many industries. Can anyone suggest the cheapest way to upload inventory and ensure prices are current from my vendors? I'm assuming it will involve outsourcing, but I would like ideas on how to set up the compensation model.

  • How to determine the amount to spend per phrase on Adwords research?

    - by Anonymous -
    My company would like to start a PPC advertising campaign. While I understand the concept and how to set everything up from a technical point of view, this is something I've never done before. Logically, we'd like to test out a wide range of keywords that we think would lead to conversions, which we've put together through brainstorming and with some help from Google's External Keyword Tool. A sub-question while I remember: am I correct in thinking that, in Google's keyword tool, keywords we expect to perform well that have low competition yet high monthly searches are good, since there will be fewer advertisers, meaning our bid per click will be lower? Is there a common benchmark or process for doing a round of tests with keywords? Should we wait for 100 clicks on each keyword, see which ones have led to the most sales (or rather, sales that are sustainable given the cost per click of that keyword), then drop the ones which aren't converting and put that budget into the converting keywords? We realistically have a few hundred keywords/phrases we would like to test, but spending $100 per keyword/phrase is going to work out as quite an expensive test. It would be nice to be able to spend $5-10 per phrase, but I don't think the sample size would be great enough to determine anything usefully reliable. Another approach might be to set up all the keywords, and those that bring the most sales within x hours/days would be the ones we use. What is the common procedure with things like this? I know there are a plethora of companies that specialize in exactly this, but this is something we anticipate doing a lot in the future, so it would make sense to do it in house if at all possible.
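
    On the sample-size worry, a back-of-envelope check (hedged: illustrative numbers, assuming a conversion rate near 2% and a desired margin of error of ±1% at 95% confidence) uses the standard proportion-estimate formula:

        n = z^2 * p * (1 - p) / E^2
          = (1.96^2 * 0.02 * 0.98) / 0.01^2
          ≈ 753 clicks per phrase

    Running it the other way: at only 100 clicks the 95% margin is about 1.96 * sqrt(0.02 * 0.98 / 100) ≈ ±2.7%, wider than the 2% rate being measured, which quantifies why a $5-10 test per phrase is unlikely to be conclusive on its own.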

  • Defining formulas through the user interface in a user form

    - by BriskLabs Pakistan
    I am a student developing a simple assignment - a Windows Forms application in Visual Studio 2010. The application is supposed to construct formulas as per user requirements. The process: it has to pick data from columns of a Microsoft Access database, and the user should be able to pick the data by column name, as in a drop-down menu, and create reusable formulas from it (configure a formula once and change it again later). The following column titles from the database can be picked, for example:
    Col 1: Marks in Maths
    Col 2: Total Marks in Maths
    Col 3: Marks in Science
    Col 4: Total Marks in Science
    Finally, we should be able to construct any formula in the UI, like (Col 1 + Col 3) / (Col 2 + Col 4) = Formula 1. Once this formula is set, it is saved and the user assigns a name to it; he/she can then use the formula, and the results appear in a window below. That is, the user can calculate the desired figures (formula) by manipulating only the underlying data in the UI layer: choose the data for a period, apply the formula, and get the answer. Problem: it looks like I have to create an app where rules are set through the UI. This means no stored procedures are required in SQL. Please suggest the right approach.
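
    The usual shape of a solution here is to store the user's formula as plain text and feed it to an expression evaluator at display time, instead of baking it into code or stored procedures. A language-neutral sketch of that idea, shown with awk over a CSV export (the real app would hand the same stored string to a .NET expression-evaluator library; file and column names are made up):

        # scores.csv columns: maths, maths_total, science, science_total
        printf '40,50,35,50\n45,50,48,50\n' > scores.csv

        # the user-defined formula, kept as data rather than code
        formula='($1 + $3) / ($2 + $4)'

        # evaluate the stored formula against every row
        awk -F, "{ print $formula }" scores.csv    # prints 0.75 and 0.93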

  • TechEd 2010 Day Four: Learning how to help others learn

    - by BuckWoody
    I do quite a few presentations, teach at the University of Washington, and also teach other classes. But I'm always learning from others how to help others learn. At events like TechEd I have access to some of the best speakers around, so I try to find out what they do that works. I attended a great session by Allen White, in which he demonstrated a set of PowerShell scripts. He said that Dan Jones of the Microsoft Manageability team told him that while demonstrating a script he needed to provide some visual way to represent the process. Allen used one of the oldest visualizations around - a flowchart. It was the first time I'd seen one used to illustrate a PowerShell script, and it was very effective. I'm totally stealing the idea. All of us are teachers - we help others on our team understand what we're up to. Make notes of what you find effective when others teach you, and then meld that into your own way of teaching.

  • Advice for migrating email server

    - by Chris Adams
    Hi there, I'm planning to migrate a Zimbra server with about 200 GB of data from a server hosted in an office into a datacentre, to increase uptime (we've had a couple of outages when our network here started flaking out, and we have people in other countries relying on this server too). However, I'm not sure how best to migrate the data into the datacentre without rendering the connection unusable during office hours, because there's far too much to send overnight over the two-meg upstream connection we have here. I'm familiar with using tools like nice to stop a long-running process degrading machine performance - is there a simple way to throttle a connection during office hours, so the long-running transfer doesn't block the pipe, but then opens up outside office hours to make the most of the bandwidth? I'm aware the alternative here is to simply mail a hard drive to the datacentre, but I'd like to avoid doing that if I could. We're using CentOS Linux for our servers, in the office and the datacentre, so extra points for an open source Linux answer.
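
    One low-tech pattern that fits: run the bulk copy through rsync under a bandwidth cap, and use cron to restart it with a looser cap outside office hours (a sketch; host, paths, and limits are placeholders, and rsync's --partial keeps restarts cheap by resuming part-copied files):

        # daytime: capped at ~50 KB/s so the pipe stays usable
        rsync -az --partial --bwlimit=50 /opt/zimbra/ user@datacentre:/opt/zimbra/

        # crontab: at 19:00 restart uncapped, at 08:00 restart capped again
        0 19 * * 1-5  pkill -f 'rsync.*zimbra'; rsync -az --partial --bwlimit=0  /opt/zimbra/ user@datacentre:/opt/zimbra/
        0 8  * * 1-5  pkill -f 'rsync.*zimbra'; rsync -az --partial --bwlimit=50 /opt/zimbra/ user@datacentre:/opt/zimbra/

    The same throttling can be done generically with trickle if the transfer tool isn't rsync.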

  • Getting iTunes to play third party AAC files

    - by Redmastif
    I have a library filled with some old MP3 files, and I'm in the process of changing them all to AAC for the better sound quality. Obviously I can't just create AAC versions of the files I already have, because they would sound worse (lossy compression converted to more lossy compression), so I'm going to their source, downloading them in a lossless form, and using a third-party tool to make them into AAC. Apparently iTunes will not handle AAC files that aren't made with iTunes. Is there a way around this? I've looked at third-party programs and would be willing to use them, but since they all require the iTunes/iPod/iEverything driver, I don't know whether they would still reject my files or not. Also, before you jump on my back about pirating: these files are from old CDs that I lost years ago. I paid for them. Thanks.
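
    For the lossless-to-AAC step itself, one route worth testing (a sketch, assuming ffmpeg is installed; file names are placeholders) is encoding to .m4a, the AAC-in-MPEG-4 form iTunes expects, rather than a raw .aac stream:

        # one file: FLAC in, 256 kbps AAC out
        ffmpeg -i track.flac -c:a aac -b:a 256k track.m4a

        # whole folder
        for f in *.flac; do ffmpeg -i "$f" -c:a aac -b:a 256k "${f%.flac}.m4a"; done

    The container often matters more than the encoder here: iTunes generally imports well-formed .m4a files regardless of which tool produced them.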

  • How do I make a Windows virtual machine replicate to another datacenter/cloud?

    - by zippy
    We have a Windows 2008 VM running IIS and SQL Server Express (it's an all-in-one web application). We need to have another copy at our secondary datacenter site. What is the best way to do this? It doesn't have to be running all the time, but it has to have an almost-current copy of the VM. I took a look at VMware Fault Tolerance and, after the heart attack at the price, I started looking for another solution. If need be, I wouldn't mind copying it over to a cloud VM provider, if I can find one that lets me upload my own VMs and start them up without any conversion process.

  • Congratulations to 2012 Innovation Award winners in BPM category

    - by Manoj Das
    Last year many of our customers went live on BPM 11g. It is my great pleasure to congratulate two of them - Amadeus and Navistar - for being awarded the Oracle Fusion Middleware Innovation Award at Oracle OpenWorld 2012. We invited our customers to submit their most innovative BPM implementations that have delivered substantiated value to them. This year we saw more than 20 submissions from customers seeing significant business value from their live BPM 11g deployments. The submissions came from across the world, spanning industry verticals including manufacturing, healthcare, logistics, high tech, the public sector, and education, and covering many process usage patterns. Award submissions were evaluated based on the uniqueness of their business case, business benefits, level of impact relative to the size of the organization, complexity and magnitude of implementation, and the originality of architecture. [Photo: the Amadeus team receiving the Innovation Award from Hasan Rizvi.] Congratulations to Amadeus and Navistar and their teams on being recognized from among some very strong submissions, and more importantly for the business value delivered. It is an honor to be part of your success and to play a small role in the innovation you drive. Navistar is a leading truck manufacturing company which produces International® brand commercial and military trucks, MaxxForce® brand diesel engines, IC Bus™ brand school and commercial buses, and Navistar RV brands of recreational vehicles. The company also provides truck and diesel engine service parts. Amadeus is a leading transaction processor for the global travel and tourism industry, providing transaction processing power and technology solutions to both travellers and travel providers. Both Navistar and Amadeus have leveraged Oracle BPM Suite to improve visibility into their business and to make their business more agile and efficient. We congratulate them again and wish them continued success in their business and future BPM initiatives.

  • Customized Lubuntu CD-ROM installation crashes

    - by SBarve
    I have created a customized Lubuntu live CD using UCK. After burning the CD and using it for installation, the install works fine on an HP desktop but fails on a Dell desktop. Here is the error; can someone help sort it out?

        We are sorry; the installer crashed. After you close this window, we
        will allow you to file a bug report using the integrated bug
        reporting tool. This will gather information about your system and
        your installation process. The details will be sent to our bug
        tracker and a developer will attend to the problem as soon as
        possible.

        Traceback (most recent call last):
          File "/usr/lib/ubiquity/plugins/ubi-timezone.py", line 173, in geoname_cb
            for result in json.loads(message.response_body.data):
          File "/usr/lib/python2.7/json/__init__.py", line 326, in loads
            return _default_decoder.decode(s)
          File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
            obj, end = self.raw_decode(s, idx=_w(s, 0).end())
          File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
            raise ValueError("No JSON object could be decoded")
        ValueError: No JSON object could be decoded

  • Oracle Logical Standby redo generation

    - by DCookie
    Oracle 10.2.0.4 database with a logical standby, on Win2K3. Recently a rather large delete operation was carried out on the production instance. I'm experiencing difficulty with the logical standby: it gets a couple of hundred archive logs (58 MB each) into the operation, and then the apply process fails with an out-of-memory error. Unfortunately, every time it fails it has to restart the apply from the beginning of the transaction, which is taking a couple of days each time. Anyway, in trying to resolve this problem, I've noticed that each archive log from the production system generates 5 or 6 log switches on the standby. I don't understand why this should be. Anyone have any ideas? A related question I've not found the answer to: does anyone know whether the logical standby must be running in archivelog mode? I really don't have a need to keep the logs.

  • How to determine the best byte size for the dd command

    - by James
    I know that doing dd if=/dev/hda of=/dev/hdb does a full, low-level hard drive copy. I've heard that people have been able to speed up the process by increasing the number of bytes that are read and written at a time (default 512) with the "bs" option. People have suggested that the optimal byte size is tied to the sector size; I personally think it has more to do with the amount of cache the hard drive has. My question is: what determines the ideal byte size for copying from a hard drive, and why does it determine the ideal byte size?
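
    Rather than reasoning it out from sector and cache sizes, the question can be settled empirically by timing the same span of disk at several block sizes (a sketch; /dev/sdX is a placeholder, and reading to /dev/null is non-destructive):

        total=$((1024 * 1024 * 1024))          # test over a fixed 1 GiB span
        for bs in 512 4096 65536 1048576 16777216; do
            echo "block size: $bs"
            # dd reports elapsed time and MB/s on its final stderr line
            dd if=/dev/sdX of=/dev/null bs=$bs count=$((total / bs)) 2>&1 | tail -1
            # flush the page cache between runs so results aren't skewed
            echo 3 | sudo tee /proc/sys/vm/drop_caches > /dev/null
        done

    Typically throughput climbs steeply from 512 bytes up to a plateau somewhere between 64 KiB and a few MiB, after which larger buffers stop helping.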

  • Downloaded databasename.bak file will not restore

    - by Jordon
    I have downloaded a databasename.bak file from my hosting company. When I try to restore that DB file in SQL Server 2008, it keeps giving me the following error:

        The media family on device 'C:\go4sharepoint_1384_8481.bak' is
        incorrectly formed. SQL Server cannot process this media family.
        RESTORE HEADERONLY is terminating abnormally.
        (Microsoft SQL Server, Error: 3241)

    According to this error, and from the following link, http://www.sqlcoffee.com/Troubleshooting047.htm, it is clear that either the file I am downloading is corrupt or it is getting corrupted on the way. Any idea why I keep receiving this error? I have tried almost everything but have been unable to fix this problem; please help me.
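
    Error 3241 usually means either a truncated/corrupted transfer or a backup taken on a newer SQL Server version than the one doing the restore. A quick way to rule the transfer in or out (a sketch; it assumes you can get a hash from the hosting side) is to compare file hashes at both ends:

        # on the hosting company's (Linux) side, or ask them to run it:
        md5sum go4sharepoint_1384_8481.bak

        # on your Windows machine, the built-in certutil gives the same hash:
        certutil -hashfile C:\go4sharepoint_1384_8481.bak MD5

    If the hashes differ, re-download in binary mode (FTP clients silently mangling .bak files in ASCII mode is a classic cause). If they match, compare SQL Server versions on the two ends.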

  • Is real-time validation of usernames good or bad?

    - by iamserious
    I have a simple form for the user to sign up to my site, with email, username, and password fields. We are now trying to implement ajax validation so the user doesn't have to post the form to find out whether the username is already taken. I can do this either on the keyup event or on the text blur event. My question is: which of these is really the best way to go? Keyup: From the user's point of view, it is good if validation happens as they type (on keyup) - of course, I wait half a second to see if the user has stopped typing before firing off the request, and the user can make adjustments immediately. But this means I am sending far more requests than if I validated the username on the blur event. Blur: The number of requests is much lower when validation is done on blur, but this means the user has to actually leave the textbox, look at the validation result, and if necessary go back to it to make changes, repeating the whole process until they get it right. I had a quick look at Google, Tumblr, and Twitter, and none of them actually does username validation on keyup events (heck, Tumblr waits for the form to be posted), but I can swear I have seen keyup validations in a lot of places too. So, coming back to the question: will keyup validations be too many for the server - an unnecessary overhead - or is it worth taking these hits to give the user a better experience? PS: All my regex validations etc. are already done in JavaScript, and only when a username passes all the other criteria does it send a request to the server to check whether the username already exists. (And the server is doing a select count(1) from user where username = '' - nothing substantial, but still enough to occupy some resource.) PPS: I'm on the ASP.NET, MS SQL stack, if that matters.

  • Update a WAR on Tomcat startup

    - by pater
    I want to set up an update process for an application running on Tomcat. The server which hosts Tomcat is only on during working hours (it is an intranet application for a small company). I was thinking that I could upload the new WAR to the server and set up "something" to run on the next server boot. This something could be a bat file, executed on server startup but before the startup of the Tomcat service, that deletes the old WAR and its exploded folder. When I update the WAR manually I also delete the work folder of Tomcat (just to be sure). I know about hot deployment, but I do not consider it an option, since I am not sure of the implications it might have on users' current working sessions. Is there a way to run such a bat file before Tomcat startup, or an alternative way to do this update? The Tomcat version isn't an issue: it is running Tomcat 6 now, but I can upgrade to version 7 if needed.
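
    The "run before Tomcat" ordering can be had by setting the Tomcat service to manual start and letting a startup task do the swap before starting the service itself. A sketch of such a bat file (service name and paths are placeholders; schedule it via Task Scheduler with an "At startup" trigger):

        @echo off
        rem make sure Tomcat isn't running (service name varies by install)
        net stop Tomcat6

        rem remove the old WAR, its exploded folder, and Tomcat's work dir
        del C:\tomcat\webapps\myapp.war
        rmdir /s /q C:\tomcat\webapps\myapp
        rmdir /s /q C:\tomcat\work\Catalina\localhost\myapp

        rem move the uploaded WAR into place, then bring Tomcat up
        move C:\staging\myapp.war C:\tomcat\webapps\myapp.war
        net start Tomcat6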

  • Tunneling x11vnc through SSH on a non-standard port to an Ubuntu computer with TightVNC

    - by user72372
    I have been stuck setting up a virtual desktop on my Ubuntu laptop. I am connecting Ubuntu to Ubuntu with x11vnc. I start the process on my laptop as follows: ssh -L5904:localhost:5900 -p Port remoteuser@remoteip. That command works. Then I start the x11vnc server: x11vnc -noncache -once -shared -rfbauth ~/.vnc/passwd. This command works and starts the connection. Then I open another window on my laptop and type export VNC_VIA_CMD='/usr/bin/ssh -2 -c aes128-cbc -x -p Port -l User -f -L %L:%H:%R %G sleep 20' (not sure if this works). Then I type vncviewer -encodings Tight -depth 8 -quality 1 -via IPofremotemachine -u remoteuser localhost:01. The first time this worked, but from then on it just gives me the vncviewer -help screen every time: I type in the password for my remote machine, and then it shows the -help screen for vncviewer. I think the problem is with the TightVNC viewer, but I don't know what it is. Please help. I got some of this from www.vanemery.com/Linux/VNC/vnc-over-ssh.html.
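
    A simpler arrangement that avoids the -via machinery entirely is to keep the manual tunnel and point the viewer at the forwarded port (a sketch; Port, remoteuser, and remoteip are the same placeholders as in the question):

        # 1. local laptop: forward local port 5904 to the remote's 5900
        ssh -p Port -L 5904:localhost:5900 remoteuser@remoteip

        # 2. inside that ssh session, start the server bound to loopback only
        x11vnc -noncache -once -shared -rfbauth ~/.vnc/passwd -localhost

        # 3. second local terminal: display :4 means TCP port 5904
        vncviewer -encodings Tight -depth 8 -quality 1 localhost:4

    Since the viewer talks only to localhost, all VNC traffic rides the SSH tunnel, and the non-standard SSH port is handled once, in step 1.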

  • Best technique for reusing a Windows system image across configurations

    - by Martin Wiboe
    We are a small company that provides solutions for ventilation systems. Part of the solution is a "controller" which communicates with the ventilation equipment. These controllers are simply Dell computers that come with our Windows 7 system image on them and sometimes some special hardware. We typically do a batch of 10 controllers at a time. We have been using Norton Ghost to apply the system image, but this process breaks because Dell changes the system configuration often, and our Windows image then does not contain the correct drivers. This is especially a problem when they change the RAID controller. To improve this, I see two options:
    - Use some kind of virtualization and install a hypervisor on each PC. This would solve the driver problem, but would probably cause trouble with our special hardware.
    - Use some method of adding the proper drivers to our Windows image in offline mode.
    I haven't got much experience with either of these approaches. How would you solve our problem?
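
    For the second option, Windows 7 images can have drivers injected offline with the stock DISM tool (a sketch with placeholder paths; it assumes the image is kept in WIM format, so it implies moving from Ghost to an ImageX/DISM-based workflow):

        rem mount the image, inject everything under the driver folder, commit
        dism /Mount-Wim /WimFile:C:\images\controller.wim /Index:1 /MountDir:C:\mount
        dism /Image:C:\mount /Add-Driver /Driver:C:\drivers\dell /Recurse
        dism /Unmount-Wim /MountDir:C:\mount /Commit

    Each new Dell hardware revision then costs one driver-folder download and a re-commit, instead of a rebuilt image.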

  • Herding Cats - That's My Job....

    - by user709270
    Written by Mike Schmitz - Sr. Director, Program Management, Oracle JD Edwards. I remember seeing a Super Bowl commercial several years ago showing some well-dressed people on the African savanna herding cats. I remember turning to the people I was watching the game with and telling them, "You just watched my job description." Releasing software is a multi-faceted undertaking. In addition to making sure the code changes are complete, you also need to make sure the other key parts of a release are ready. For example, when you have a question about the software, will the person on the other end of the phone be ready to answer it? If you need training on that cool new piece of functionality, will there be an online training course ready for you to review? If you want to read about how the software is supposed to function, is there a user manual available? Putting all the release pieces together so they are available at the same time is what the JD Edwards Program Management team does. It is my team's job to work with all the different functional teams so that when a release is made generally available, you have all the things you need to be successful. The JD Edwards Program Management team uses an internal planning tool called the Release Process Model (RPM) to ensure all deliverables are accounted for in a release. The RPM makes sure all the release deliverables are ready at the correct time and in the correct format, and it helps all the functional teams in JD Edwards know which release deliverables they are accountable for and when they are to be delivered. It is my team's job to make sure everyone understands what they need to do and when they need to deliver, and then to make sure they are all on track to deliver on time and in the right format. It is just that some days this feels like herding cats.

  • Offshoring: does it ever work?

    - by DanSingerman
    I know there has been a fair amount of discussion on here about outsourcing/offshoring, and the general opinion seems to be that at best it is difficult, and at worst it fails. I have direct experience of offshoring myself: a previous company where I was a dev manager wanted to send some development offshore, and we ran a pilot scheme to see how well it would work. It was a complete failure, although it is not completely clear to me whether this was down to the offshore devs being less talented, the process, or other factors (no doubt it was really a combination). I can see, as a business, how offshoring looks attractive (a much lower day rate), but as far as I can see, the only way it could possibly work is if you do exceptionally detailed design up front, with incredibly detailed specifications; and by the time you have invested in producing that, you have probably spent nearly as much as if you had written the actual code locally (which I think is an instance of No Silver Bullet). So, what I want to know is: does anyone here have any experience of offshoring actually working, ever? Especially if there are any success stories of it working in a semi-agile way? I know there are developers here from all over the world; has anyone worked on an offshore project they consider successful?

  • How should I troubleshoot a problematic wireless connection on Linux?

    - by Gearoid Murphy
    I recently purchased a Netgear 150 USB wireless dongle for use with my 11.10 Xubuntu amd64 system. Using the network-manager interface, I can see local wireless networks and enter the authentication details for my local wireless LAN. Unfortunately, the connection does not seem to work: I keep getting notifications that my wireless has disconnected (but none indicating that I've connected). When I examine syslog, it seems to indicate that I've successfully associated with the wireless switch and that DHCP has successfully acquired an IP address, but the log shows that the DHCP process keeps sending requests and eventually drops the connection. ifconfig wlan0 never shows the DHCP address logged in syslog. I suspect that the problem lies with the USB dongle, my configuration, or the wireless switch, but I am not certain how to isolate the problem. Can anyone provide some insight on how I should go about homing in on the cause of this problem, or verifying the functionality of the individual components? Thanks.
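
    A hedged first pass at isolating exactly this symptom (association succeeds, a DHCP lease is offered, but the address never sticks), run while reproducing the failure:

        # is the dongle recognized at all, and is the radio soft-blocked?
        lsusb | grep -i netgear
        rfkill list

        # watch association and DHCP chatter live during a connect attempt
        tail -f /var/log/syslog | grep -Ei 'wlan0|dhcp|wpa'

        # take network-manager out of the loop for one verbose DHCP attempt
        sudo dhclient -v wlan0
        ip addr show wlan0

    If dhclient succeeds by hand, suspicion shifts to network-manager's handling of the dongle's driver; if it times out too, the dongle/driver or the access point's DHCP side is the better suspect.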

  • Is it possible to make Ctrl+C as responsive as Ctrl+Break in the Windows 7 console?

    - by Peter Graham
    Is it possible to make Ctrl+C act like Ctrl+Break in the Windows 7 cmd.exe console? By default, Ctrl+C seems to send a signal only the next time the input buffer is read, whereas Ctrl+Break sends a signal immediately. This makes Ctrl+C useless for ending processes, because when I want to end a process I want to end it immediately. I'm using Ctrl+Break for now, but it's far harder to type. It looks like in DOS you could add BREAK=ON to CONFIG.SYS to achieve this; is there no equivalent in Windows 7?

  • Upgrading from PHP 5.3 to PHP 5.4 with MacPorts

    - by dr.stonyhills
    PHP 5.4 has been available for some time now, and MacPorts recently caught up with the release of the php54 port, but the upgrade process is not as clear as it could be - even worse for those who are new to maintaining multiple versions of PHP on the same machine. I am keen on trying out some of the new features in PHP 5.4, like traits and the new array syntax, while falling back on PHP 5.3 for compatibility with other stuff. So I ran sudo port install php54 (with all the variants, apache2, etc.). Then I told it which PHP port to use as the default: sudo port select --set php php54. Checking which version of PHP is active in the terminal with php -v outputs PHP 5.4.3. But I seem to be having issues choosing the right non-CLI version: the module run by Apache is still PHP 5.3.12. Do I have to change the reference to libphp5 in Apache's httpd.conf? Any advice on the right workflow for switching between PHP versions on MacPorts would be greatly appreciated!
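
    port select only switches the php binary on the command line; the Apache side keeps loading whichever module httpd.conf names. A sketch of the check and the fix (paths follow MacPorts defaults and the module file name may differ per port):

        # which PHP module is Apache loading right now?
        grep -i 'LoadModule php' /opt/local/apache2/conf/httpd.conf

        # point it at the php54 build instead of the php5 one, e.g.:
        #   LoadModule php5_module modules/mod_php54.so
        # then restart Apache
        sudo /opt/local/apache2/bin/apachectl restart

    Switching back to 5.3 for a given project is then a matter of flipping that one LoadModule line (plus port select for the CLI) rather than reinstalling anything.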

  • .NET Dependency Management Systems

    - by StriplingWarrior
    I have some .NET projects that are starting to get large enough to merit looking into dependency-management solutions, so we don't have to copy binaries from one project to another. Here's what I've found so far: NPanday is based on a port of Maven. I can't tell how actively it is worked on, but the last release was in May 2011. NuGet seems to be under active development, and it appears to have support directly from Microsoft. Some people complained that it "only addresses dependency resolution," but I don't know what else it should address, or whether it has added more features since that point. It does appear to have recently added the ability to pull binaries in as part of the build process, so we don't have to commit them to our repositories. Refix appears to still be in beta, having received no attention since September 2011. Would somebody with recent experience using any of these dependency-management tools (or any others that work well) share their experience? Is NuGet mature enough to use for dependency management? If not, what does it lack?
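
    On the "binaries as part of the build" point: the pattern with NuGet is to keep the packages folder out of source control and restore it from each project's packages.config at build time. A sketch using the nuget.exe command line (the project layout is hypothetical):

        rem restore every package a project lists, into a shared packages dir
        nuget.exe install MyProject\packages.config -OutputDirectory packages

    Wired into a pre-build step (or an MSBuild target), this means the repository carries only the small packages.config manifests while the binaries themselves are fetched on demand.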
