Search Results

Search found 25039 results on 1002 pages for 'machine learning'.


  • Windows Security Center Service is missing

    - by TheTub
    I am trying to fix a Windows 7 machine here that has been infected with all kinds of malware. I have removed all of it as far as I can see, but I am stumped by one last task. One little bugger managed to remove the Windows Security Center service from the list of Windows services, so I cannot start it or set it to start automatically. At the moment I cannot turn on the Windows Firewall or get any anti-virus software running. The Security Center shows an error when I try (screenshot from the original post not included). Does anyone know how to add this service back to the list so I can set it to start? I don't have a backup of the registry for this computer (it's not mine). Many thanks, TT
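    A hedged sketch of one way to restore the missing service definition, assuming another Windows 7 machine of the same edition is available to copy it from (wscsvc is the usual registry name for Security Center; the commands below are standard, but treat them as a starting point rather than a confirmed fix for this particular infection):

        :: On a healthy Windows 7 machine, export the service definition
        reg export HKLM\SYSTEM\CurrentControlSet\services\wscsvc wscsvc.reg

        :: On the repaired machine (elevated command prompt), import it and re-enable the service
        reg import wscsvc.reg
        sc config wscsvc start= auto
        net start wscsvc

    A reboot may be needed before the Services list and the Security Center pick the service up again; delayed-auto is also a valid start type for this service.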


  • Where is the bottleneck?

    - by jsymon
    There is a limit on connections somewhere along the line here... On a Windows Server 2008 machine, each request to a URL running on localhost takes ~3 seconds to complete. This is fine and normal for the URL. However, if I open the same localhost URL in about 10 tabs and set them to reload all at the same time, they finish sequentially, 3 seconds after each other, meaning the last tab has taken 30 seconds to load (3s x 10). What is especially odd is that Firebug reports each page as taking 3 seconds to load. Another point to add is that the status bar just sits at 'done' for the last tab until 3 seconds before completing, where it then changes to 'waiting for localhost'. I am praying there is some connection limit somewhere, otherwise this would be a disaster if more than one user ever visited the site at a time! Maybe there is a limit somewhere such that one PC can't make more than two simultaneous requests to a URL at a given time?
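    One way to narrow down where the serialization happens, assuming a load-testing tool such as Apache Bench (ab, which ships with Apache HTTP Server builds, including the Windows ones) is available, is to fire the concurrent requests from outside a browser; the URL below is just the hypothetical page from the question:

        ab -n 10 -c 10 http://localhost/page.aspx

    If ten concurrent requests finish in roughly 3 seconds total, the limit was the browser's per-host connection cap rather than the server. If they still complete one after another, the serialization is server-side; one common (but unconfirmed here) cause on ASP.NET is read-write session state, which locks requests sharing the same session cookie so they execute sequentially.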


  • securing source code with bitlocker

    - by Daniel Powell
    We need to deploy a web-based application at a client site where it will be within their local intranet. Part of our requirement is to provide some basic security to protect our IP. I realise that nothing is a 100% guaranteed fix, but we are just looking to make it a bit harder for most people. The server will be running Server 2008 and I was considering using BitLocker as a cheap and nasty way to protect it. From what I understand, assuming the motherboard supports it, we can use the transparent BitLocker mode, which means that moving the HDD to another PC will leave the drive unreadable in that machine, barring some sort of cold boot attack to steal the encryption keys. Is this assumption correct? And in the case that the motherboard or any other component fails in the PC and we need to replace it, do we lose access to our data, or is there a way to decrypt it (obviously accessible only to our company)? EDIT: we do have legal documents that cover this, and we will be locking the PC physically; the client will not have access to the PC (Windows login) other than via the website we host on it.
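    A hedged sketch of the manage-bde commands involved, assuming BitLocker with a TPM ("transparent" mode) on the OS volume; adding a numerical recovery password is what keeps the data reachable if the motherboard or TPM has to be replaced:

        :: Check encryption status and the key protectors currently in place
        manage-bde -status C:
        manage-bde -protectors -get C:

        :: Add a 48-digit recovery password and store it somewhere only your company can reach
        manage-bde -protectors -add C: -RecoveryPassword

        :: After moving the drive to replacement hardware, unlock it with that password
        manage-bde -unlock C: -RecoveryPassword 111111-222222-333333-444444-555555-666666-777777-888888

    The password shown is a placeholder; manage-bde prints the real one when the protector is added. A TPM-only configuration with no recovery protector on file would indeed leave the data unreadable after a board swap.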


  • White Paper on Analysis Services Tabular Large-scale Solution #ssas #tabular

    - by Marco Russo (SQLBI)
    Since the first beta of Analysis Services 2012, I have worked with many companies designing and implementing solutions based on Analysis Services Tabular. I am glad that Microsoft published a white paper about a case study using one of these scenarios: An Analysis Services Case Study: Using Tabular Models in a Large-scale Commercial Solution. Alberto Ferrari is the author of the white paper and many people contributed to it. The final result is a very technical document based on a case study, which provides a level of detail that I don't see often in other case studies (which are usually more marketing-oriented). This white paper has the following structure: Requirements (data model, capacity planning, client tool); Options considered (SQL Server Columnstore Indexes, SSAS Multidimensional, SSAS Tabular); Data Model optimizations (memory compression, query performance, scalability); Partitioning and Processing strategy for near real-time latency; Hardware selection (NUMA analysis, Azure VM tests); Scalability tests (estimation of maximum users per node). If you are in charge of evaluating Tabular as an analytical engine, or if you have to design your solution based on Tabular, this white paper is a must-read. But if you just want to increase your knowledge of Analysis Services, you will find a lot of useful technical information. That said, my favorite quote of the document is the following one, funny but true: […] After several trials, the clear winner was a video gaming machine that one guy on the team used at home. That computer outperformed any available server, running twice as fast as the server-class machines we had in house. At that point, it was clear that the criteria for choosing the server would have to be expanded a bit, simply because it would have been impossible to convince the boss to build a cluster of gaming machines and trust it to serve our customers. But, honestly, if a business has the flexibility to buy gaming machines (assuming the machines can handle capacity) – do this. — Owen Graupman, inContact. I want to write a longer discussion about how companies are adopting Tabular in scenarios where it is the hidden engine of a more complex solution (and not the classical "BI system"), because it is more frequent than you might expect (and has several advantages over many alternative approaches).


  • Performance difference between MacBook Pro (2.8 GHz) vs Air (1.7 GHz)?

    - by jonathanconway
    I'm comparing these two Apple laptops:
    MacBook Pro (13", 2011 model):
    - 2.8GHz dual-core Intel Core i7 processor with 4MB shared L3 cache
    - 4GB (two 2GB SO-DIMMs) of 1333MHz DDR3 SDRAM
    - AMD Radeon HD 6770M graphics processor with 1GB of GDDR5 memory on 2.4GHz configuration
    MacBook Air (13", 2011 model):
    - 1.7GHz dual-core Intel Core i5 with 3MB shared L3 cache
    - 4GB of 1333MHz DDR3 onboard memory
    - Intel HD Graphics 3000 processor with 384MB of DDR3 SDRAM shared with main memory
    There's definitely a gap between them in terms of CPU speed and graphics, but what practical difference would this make on a day-to-day basis? On the one hand, I love the sleek, thin appearance of the Air. On the other hand, I don't want a machine that's going to be dog-slow when doing tasks such as running virtual machines, dual-booting to Windows and running multiple instances of Visual Studio, and maybe some light gaming. Is there going to be a major difference that makes the MacBook Pro a more attractive purchase?


  • IIS displaying page differently when localhost is used in URL vs. hostname

    - by maik
    I'm having (yet another) strange problem with IIS. When viewing an ASPX page I've designed on my local machine by browsing to http://localhost/page.aspx, the page looks as expected (and looks the same in IE, Firefox and Chrome). If I change localhost to my_hostname, the page is rendered with a disabled vertical scroll bar. The behavior was first noticed when I published my site to our live server and saw the same discrepancy. After beating my head against the wall I tried what I described above and was able to duplicate my "problem". So with that, I turn to you guys. This wouldn't really be an issue (save for the cross-browser inconsistency) except that it screws up an "absolute"ly positioned <div>, moving it partway off the screen instead of it being centered like it should be (and is when viewed any other way, except in IE when the address is anything but localhost).
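    One frequent culprit for localhost-versus-hostname differences that show up only in IE is Compatibility View being applied to intranet hostnames, which changes the document mode and can shift absolutely positioned elements; this is a guess worth ruling out rather than a confirmed diagnosis. Forcing the latest rendering engine from the page head makes the test easy:

        <meta http-equiv="X-UA-Compatible" content="IE=edge" />

    Checking the Document Mode reported by IE's F12 developer tools for both URLs will show whether the two requests are actually being rendered by the same engine.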


  • udev: waiting for uevents to be processed on my Gentoo

    - by stan31337
    During startup I see the machine spending about 30 seconds on this:
    udev: waiting for uevents to be processed
    Then I get a quick message which says something like:
    devfs: timeout (50 seconds)
    I can't see the whole thing because after that the system starts up very fast, including Xfce. What logs and configs do I need to provide for further investigation?
    $ uname -a
    Linux genta 3.6.6-gentoo #1 SMP Sun Nov 11 11:02:23 NOVT 2012 i686 Genuine Intel(R) CPU T2300 @ 1.66GHz GenuineIntel GNU/Linux
    Thank you! UPD: rc-status output:
    genta / # rc-status sysinit
    Runlevel: sysinit
      dmesg         [ started ]
      udev          [ started ]
      devfs         [ started ]
    genta / # rc-status boot
    Runlevel: boot
      hwclock       [ started ]
      modules       [ started ]
      fsck          [ started ]
      root          [ started ]
      mtab          [ started ]
      localmount    [ started ]
      sysctl        [ started ]
      bootmisc      [ started ]
      hostname      [ started ]
      termencoding  [ started ]
      keymaps       [ started ]
      net.lo        [ started ]
      swap          [ started ]
      urandom       [ started ]
      procfs        [ started ]


  • How much processor speed and cores do I need for these tasks?

    - by ajay
    I am planning to buy a new laptop as I find my current one very slow. My question here is specifically related to RAM size and CPU power. I will mostly be doing development (not many games). I would be dabbling in distributed computing and in multithreaded, data-intensive parallelizable tasks on multi-core machines. For example, I would want to do concurrent programming in Scala/Java/Clojure etc. and be able to see the parallelization. Furthermore, I would want the RAM to be enough. But from a developer machine standpoint, do you think 4GB RAM and a 2.53GHz dual-core processor would be enough? I'm basically looking at this model: http://store.apple.com/us/configure/MC118LL/A?mco=MTM3NDcyODk (link dead)


  • How to get a good current VMWare browser appliance?

    - by Brooks Moses
    I'd like to have a small VMWare virtual machine that runs a copy of Firefox, with Flash enabled. (Or some equivalently-capable browser.) I tried doing some Google searching with no luck finding good keywords, and tried looking through VMWare's "marketplace" of VMs, but all I found were things from 2006 or so. Is there a reasonably easy way to get a current one? Ideally, I'd like to just download one somewhere, but in the alternative, a quick how-to guide would be useful. I know I could go through the whole process of getting a full-Linux-install VM and setting things up, but that seems like quite a lot of trouble and ends up with a pretty heavyweight solution to the problem, so I'm hoping there's a simpler way.


  • Collaboration using github and testing the code

    - by wyred
    The procedure in my team is that we all commit our code to the same development branch. We have a test server that runs updated code from this branch so that we can test our code on the servers. The problem is that if we want to merge the development branch into the master branch in order to publish new features to our production servers, some features that may not be ready will be applied to the production servers. So we're considering having each developer work on a feature/topic branch, where each of them works on their own features and, when a feature is ready, merges it into the development branch for testing, and then into the master branch. However, because our test server only pulls changes from the development branch, the developers are unable to test their features there. While this is not a huge issue, as they can test on their local machines, the one problem I foresee is if we want to test callbacks from third-party services like SendGrid (where you specify a URL for SendGrid to update you on the status of emails sent out). How should we handle this problem? Note: We're not advanced git users. We use the GitHub app for Mac OS X and Windows to commit our work.
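    A minimal sketch of the feature-branch flow being considered, using git on the command line (branch names are made up for illustration):

        # start a feature branch off the shared development branch
        git checkout development
        git pull origin development
        git checkout -b feature/email-callbacks

        # ...commit work on the feature...

        # when the feature is ready, merge it back so the test server picks it up
        git checkout development
        git merge --no-ff feature/email-callbacks
        git push origin development

    For the third-party callback problem, one option (not from the original post) is to give the test server a second checkout, reachable at its own URL, that can be switched to whichever feature branch needs to receive SendGrid callbacks, so the shared development branch stays untouched until the feature is merged.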


  • pfSense and IPv6, direct connect rules

    - by Bgnt44
    My question is about pfSense configuration for IPv6. In theory, IPv6 addresses are fully routable, even in a LAN. As a starting point I've been using this tutorial: http://doc.pfsense.org/index.php/Using_IPv6_on_2.1_with_a_Tunnel_Broker So my LAN network has both IPv4 and IPv6 connectivity. I would like to be able to access my LAN machines by their IPv6 addresses, but I'm confused about which firewall rules I need to set to be able to do that. Even if I set all interfaces to pass all packets, I'm not able to directly access any machine by its IPv6 address. Did I miss something? Edit: OK, I found that it works now. I think it has always worked, but my ISP seems to support IPv6 sometimes and sometimes not... weird.


  • Cannot change to a static IP in Fedora 19

    - by user196272
    I'm having a bit of a weird situation. I've installed Fedora Linux 19 onto a virtual machine with no GUI. Initially eth0 does not show up when I run ifconfig. When I run dmesg | grep eth I see the adapter, but it says it changed names to p2p1. Once I run the ifconfig p2p1 up command it shows up. Now when I try to edit /etc/sysconfig/network-scripts/ifcfg-p2p1, it does not exist; the only scripts that are there are lo and enp0s3. If I try to create the ifcfg-p2p1 file with the correct settings, I cannot restart the network service. I tried editing the enp0s3 file, but that did not work. I'm fairly new to Linux and not sure what else to put in here, so if you need any more information just let me know and I'll put it in here.
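    A hedged sketch of what the missing ifcfg file might look like; every address below is a placeholder, and if NetworkManager is managing the interface it may be necessary to set NM_CONTROLLED=no (or configure the address through nmcli instead):

        # /etc/sysconfig/network-scripts/ifcfg-p2p1
        DEVICE=p2p1
        TYPE=Ethernet
        ONBOOT=yes
        BOOTPROTO=none
        IPADDR=192.168.1.50
        PREFIX=24
        GATEWAY=192.168.1.1
        DNS1=192.168.1.1
        NM_CONTROLLED=no

    After saving the file, restarting the legacy network service (systemctl restart network.service) or rebooting should bring p2p1 up with the static address, assuming the interface really is named p2p1 on this install.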


  • Red Gate's on the road in 2012 - Will you catch us?

    - by RedAndTheCommunity
    Annabel Bradford, our Communities and Events Manager, tells all about her experience of our 1st SQL Saturday of the year. The first stop this year was SQL Saturday #104 Colorado Springs, back in early January. I made the trip across from the UK just for this SQL Saturday event, and I'm so glad I did. I picked up Max from Red Gate's Pasadena office and we flew into Colorado Springs airport late on Friday evening to be greeted by freezing temperatures, which was quite a shock after the California sunshine. Rising before the sun, we arrived at Mr Biggs, the venue for the event, in the darkness. It was great to see so many smiling attendees so bright and early on a Saturday morning. Everyone was eager to learn more about SQL Server, and hundreds of people came and chatted with us at the table, saw demos and learnt more about Red Gate tools. The event highlights for the attendees were definitely the unlimited lazer quest, bowling and pool available during the break times. For Max, Grant Fritchey and me at the Red Gate table, the highlights have to be meeting customers and getting the opportunity to meet attendees who'd heard of, but wanted to know more about, Red Gate. We were delighted to hear lots of valuable feedback that we took back to share with the team. As a thank you for sharing insights about their work lives and how they use SQL Server and Red Gate tools, attendees are able to take away Red Gate SQL Server books. We aim to have a range of titles available when we exhibit, so that attendees can choose a book that's going to be most interesting to them, and that they can use as a reference back at the office. Every time I meet a Red Gate user or a member of the SQL community, I'm always overwhelmed by the enthusiasm they have for their industry. Everyone who gives up their time to learn more about their job should be rewarded, and at Red Gate we like to do just that. Red Gate has long supported the SQL community through sponsorship to facilitate user group meetings and community events, but it's only through face-to-face contact that we really get a chance to see the impact of our support. I hope we'll have the chance to see you on the road at some point this year. We'll be at a range of events, including free SQL Saturdays, one-day free events 'the Red Gate way', two-day Rallys, and full-week conferences. Next stop is SQL Saturday #109 Silicon Valley on March 3rd where you'll meet Jeff and Arneh, two of our US-based SQL team members. Be sure to ask them any questions you've got about the Red Gate tools, as these guys will be delighted to hear your questions, show you the options, and will make a note of your feedback to send through to the development team. Until the next time. Happy learning! Annabel (Photo: Grant, Max and Annabel at SQL Saturday #104 Colorado Springs)


  • How to force a host to not send a broadcast for an IP address in its own subnet?

    - by Bruce
    For a LAN, instead of a switch, I have built a topology where each machine is connected to a router. Each host is assigned an IP address from 10/8. (The interface details from the original post are omitted here.) Let's say I ping 10.16.0.3 from this host. The routing table of 10.16.0.2 has been configured to use the router (10.16.0.1) as the default gateway. But since the destination IP address (10.16.0.3) is in the same subnet, it sends out an ARP broadcast. I want to disable this behavior of sending an ARP broadcast and instead force the host to use the routing table. How do I accomplish this?
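    On a Linux host, one way to get this behaviour is to configure the address with a /32 prefix so the kernel never installs an on-link route for the 10/8 subnet, then add an explicit host route to the router and a default route through it. A sketch, assuming the interface is eth0 and using the addresses from the question:

        ip addr flush dev eth0
        ip addr add 10.16.0.2/32 dev eth0
        ip route add 10.16.0.1/32 dev eth0
        ip route add default via 10.16.0.1 dev eth0

    With no subnet route present, traffic for 10.16.0.3 matches only the default route and is sent to the router's MAC address; the host will still ARP, but only for the router itself.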


  • VirtualBox error 0x80004005

    - by maria
    Hi, I was trying to run Windows XP under VirtualBox (host system Ubuntu 10.04) and I got an error message saying:
    Kernel driver not installed (rc=-1908) The VirtualBox Linux kernel driver (vboxdrv) is either not loaded or there is a permission problem with /dev/vboxdrv. Re-setup the kernel module by executing '/etc/init.d/vboxdrv setup' as root. Users of Ubuntu, Fedora or Mandriva should install the DKMS package first. This package keeps track of Linux kernel changes and recompiles the vboxdrv kernel module if necessary.
    This is the terminal output:
    maria@maria-laptop:~$ sudo /etc/init.d/vboxdrv setup
    * Stopping VirtualBox kernel modules * done.
    * Uninstalling old VirtualBox DKMS kernel modules * done.
    * Trying to register the VirtualBox kernel modules using DKMS * Failed, trying without DKMS
    * Recompiling VirtualBox kernel modules * done.
    * Starting VirtualBox kernel modules * done.
    After that I can run XP in VirtualBox, but when I try to log into the user's account, I get a message that I have to register XP (I haven't done it yet, as each time I was clicking on the pop-up window suggesting registration, nothing was happening). I click that I want to register it now and a message about error 0x80004005 appears, which prevents Windows from checking the license for this computer (the message is in Polish, so I don't post it here...). I forgot to install DKMS first, but I've done it since. What should I do to run the virtual machine? I was using it already; the problem occurred only today. Thanks for any suggestions.
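    Following the hint in the error message itself, a sketch of the usual fix on Ubuntu 10.04 is to install DKMS and rebuild the module (the 0x80004005 registration prompt inside the XP guest is a separate Windows activation issue and may persist after the driver is sorted out):

        sudo apt-get install dkms
        sudo /etc/init.d/vboxdrv setup
        sudo modprobe vboxdrv
        lsmod | grep vboxdrv    # should list the module once it has loaded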


  • Install NPM Packages Automatically for Node.js on Windows Azure Web Site

    - by Shaun
    In one of my previous posts I described and demonstrated how to use NPM packages in Node.js and Windows Azure Web Site (WAWS). In that post I used the NPM command to install packages, and then used Git for Windows to commit my changes and sync them to the WAWS git repository. WAWS then triggers a new deployment to host my Node.js application. Someone may notice that an NPM package can contain many files and could be a little bit huge. For example, the "azure" package, which is the Windows Azure SDK for Node.js, is about 6MB. Another popular package, "express", which is a rich MVC framework for Node.js, is about 1MB. When I first push my code to Windows Azure, all of them must be uploaded to the cloud. Is it possible to let Windows Azure download and install these packages for us? In this post, I will introduce how to make WAWS install all required packages for us when deploying.

    Let's Start with a Demo
    A demo is most straightforward. Let's create a new WAWS and clone it to my local disk. Drag the folder into Git for Windows so that it can help us commit and push. Please refer to this post if you are not familiar with how to use Windows Azure Web Site, Git deployment, git clone and Git for Windows. Then open a command window and install a package in our code folder. Let's say I want to install "express". Then I created a new Node.js file named "server.js" and pasted in the code below.

        var express = require("express");
        var app = express();

        app.get("/", function(req, res) {
            res.send("Hello Node.js and Express.");
        });

        console.log("Web application opened.");
        app.listen(process.env.PORT);

    If we switch to Git for Windows right now we will find that it detected the changes we made, which include "server.js" and all files under the "node_modules" folder. What we need to upload should only be our source code, but the huge package files would have to be uploaded as well. Now I will show you how to exclude them and let Windows Azure install the packages on the cloud. First we need to add a special file named ".gitignore". It seems this cannot be done directly from the file explorer, since the file name only contains an extension, so we need to do it from the command line. Navigate to the local repository folder and execute the command below to create an empty file named ".gitignore". If the command window asks for input, just press Enter.

        echo > .gitignore

    Now open this file, copy the content below into it and save.

        node_modules

    Now if we switch to Git for Windows we will find that the packages under "node_modules" are no longer in the change list. So if we commit and push, the "express" package will not be uploaded to Windows Azure. Second, let's tell Windows Azure which packages it needs to install when deploying. Create another file named "package.json", copy the content below into that file and save.

        {
            "name": "npmdemo",
            "version": "1.0.0",
            "dependencies": {
                "express": "*"
            }
        }

    Now back in Git for Windows, commit our changes and push them to WAWS. Then if we open the WAWS in the developer portal, we will see that a new deployment has finished. Clicking the arrow on the right side of this deployment, we can see how WAWS handled it. In particular we can see that WAWS executed NPM, and if we open the log we can review what command WAWS executed to install the packages, along with the installation output messages. As you can see, WAWS installed "express" for me on the cloud side, so I don't need to upload the whole bunch of package files to Azure. Open the website and we can see the result, which proves that "express" has been installed successfully.

    What's Happened Under the Hood
    Now let's explain a bit about what ".gitignore" and "package.json" mean. The ".gitignore" file is an ignore configuration file for a git repository. All files and folders listed in ".gitignore" will be skipped from git push. In the example above I copied "node_modules" into this file in my local repository. This means: do not track or upload any files under the "node_modules" folder. So by using ".gitignore" I kept all packages from being uploaded to Windows Azure. ".gitignore" can contain files and folders. It can also contain the files and folders that we do NOT want to ignore. In the next section we will see how to use the un-ignore syntax to include the SQL package. The "package.json" file is the package definition file for a Node.js application. We can define the application name, version, description, author and other information in it in JSON format, and we can also list the dependent packages, to indicate which packages this Node.js application needs. In WAWS, name and version are mandatory. When a deployment happens, WAWS will look into this file, find the dependent packages, and execute the NPM command to install them one by one. So in the demo above I put "express" into this file so that WAWS would install it for me automatically. I updated the dependencies section of the "package.json" file manually, but this can be done partially automatically. If we have a valid "package.json" in our local repository, then when we are going to install a package we can specify the "--save" parameter on the "npm install" command, so that NPM will update the dependencies section for us. For example, when I wanted to install the "azure" package I would execute the command below. Note that I added "--save" to the command.

        npm install azure --save

    Once it finishes, my "package.json" will be updated automatically. Each dependent package will be listed there: the JSON key is the package name while the value is the version range. Below is a brief list of the version range formats. For more information about "package.json" please refer here.

        version      Must match the version exactly.                                     "azure": "0.6.7"
        >=version    Must be equal to or greater than the version.                       "azure": ">0.6.0"
        1.2.x        The version must start with the supplied digits, but any digit
                     may be used in place of the x.                                      "azure": "0.6.x"
        ~version     The version must be at least as high as the range, and it must be
                     less than the next major revision above the range.                  "azure": "~0.6.7"
        *            Matches any version.                                                "azure": "*"

    WAWS will install the proper version of the packages based on what you define here. That, in outline, is the process of WAWS git deployment and NPM installation.

    But Some Packages…
    As we know, when we specify dependencies in "package.json", WAWS will download and install them on the cloud. For most packages this works very well, but some special packages may not work. That is, if the package installation needs some special environment prerequisites, it might fail. For example, the SQL Server Driver for Node.js package needs "node-gyp", Python and C++ 2010 installed on the target machine during the NPM installation. If we just put "msnodesql" in the "package.json" file and push it to WAWS, the deployment will fail since there's no "node-gyp", Python or C++ 2010 in the WAWS virtual machine. For example, the "server.js" file:

        var express = require("express");
        var app = express();

        app.get("/", function(req, res) {
            res.send("Hello Node.js and Express.");
        });

        var sql = require("msnodesql");
        var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:tqy4c0isfr.database.windows.net,1433;Database=msteched2012;Uid=shaunxu@tqy4c0isfr;Pwd=P@ssw0rd123;Encrypt=yes;Connection Timeout=30;";
        app.get("/sql", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        console.log("Web application opened.");
        app.listen(process.env.PORT);

    The "package.json" file:

        {
            "name": "npmdemo",
            "version": "1.0.0",
            "dependencies": {
                "express": "*",
                "msnodesql": "*"
            }
        }

    And it failed to deploy to WAWS. From the NPM log we can see it's because "msnodesql" cannot be installed on WAWS. The solution is, in the ".gitignore" file, to ignore all packages except "msnodesql" and upload that package ourselves. This can be done with the content below. We first un-ignore the "node_modules" folder, then ignore everything under it while still letting git look inside each sub-folder, and then un-ignore one of the sub-folders, named "msnodesql", which is the SQL Server Node.js driver.

        !node_modules/

        node_modules/*
        !node_modules/msnodesql

    For more information about the syntax of ".gitignore" please refer to this thread. Now if we go to Git for Windows we will find that "msnodesql" is included in the uncommitted set while "express" is not. I also need to remove the dependency on "msnodesql" from "package.json". Commit and push to WAWS. Now we can see the deployment completed successfully, and we can use Windows Azure SQL Database from our Node.js application through the "msnodesql" package we uploaded.

    Summary
    In this post I demonstrated how to leverage the deployment process of Windows Azure Web Site to install NPM packages during the publish action. With the ".gitignore" and "package.json" files we can exclude the dependent packages from our Node.js repository and let Windows Azure Web Site download and install them while deploying. For some special packages that cannot be installed by Windows Azure Web Site, such as "msnodesql", we can put them into the publish payload as well. The combination of Windows Azure Web Site, Node.js and NPM makes it even easier and quicker for us to develop and deploy our Node.js applications to the cloud.

    Hope this helps, Shaun
    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


  • Awstats messaging non-existent user causing exim4 to go nuts

    - by Chris
    I've taken over managing a server set up by someone else who is now uncontactable. While I've managed to work out most faults and changes needed, this one is stumping me. Awstats is running on the machine and sending messages via exim4 to a user every time it runs an update. The user account has been deleted, and so the exim4 main log files are filling up with message delivery errors, which firstly hinders meaningful log analysis for anything else and secondly uses up quite a lot of space (it grew to 22GB unattended, panic!). I've been through all the conf files in /etc/awstats and can't seem to find any mention of this user account. Google just turns up results about how to use awstats to parse exim4 log files. So the question is: where is this setting (on Debian) likely to be? Cheers in advance.
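    A hedged way to track the setting down and clean up the queue; 'olduser' is a placeholder for the deleted account, and the paths are the usual Debian locations rather than anything confirmed from this server:

        # find where the deleted account is still referenced
        grep -rl 'olduser' /etc/awstats /etc/cron* /etc/aliases /etc/exim4 2>/dev/null

        # list queued mail for that account, then remove it
        exim4 -bp | grep olduser
        exiqgrep -i -r 'olduser' | xargs exim4 -Mrm

    Since awstats on Debian is usually driven by cron, the mail often comes from cron sending the job's output to the job's owner or a MAILTO address, so the awstats entry under /etc/cron.d and /etc/crontab are worth checking as well.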


  • Focus on Social Relationship Management at Oracle OpenWorld

    - by Pat Ma
    Greetings from Oracle OpenWorld 2012. Today, we're going to focus on Social Relationship Management at Oracle OpenWorld. Social networking is touching all businesses today. Customers are speaking about your brand right now on social media sites. Your employees are speaking to one another on social media sites. In an Oracle survey, 40% of consumers factor in Facebook recommendations when making purchasing decisions. Despite the rise of social networking, 70% of marketers report having little understanding of social media conversations happening around their brand. Oracle has invested in technologies that will help companies leverage social media technologies for their enterprise. Our suite of social products is collectively known as Social Relationship Management. Customers are using Social Relationship Management to get analytics on social media conversations around their brand, manage multiple social media channels while keeping their brand consistent, optimize internal workflows and processes, and create better customer relationships and experiences. In this example, using Social Relationship Management, a high-end national grocery chain is able to see that "Coconut Water" is trending in San Francisco. They are now able to send a $2-off coconut water coupon to shoppers who have checked into their San Francisco locations. This promotion further drives sales of coconut water in San Francisco. In another example, using Social Relationship Management, a technology company creates multiple Facebook pages and runs campaigns on them. These social campaigns are now integrated and tracked as another marketing channel in Oracle Fusion CRM. The technology company can now track and respond to a particular customer as he moves across multiple channels – without having to restart the conversation each time the customer contacts the company. Furthermore, the technology company can see in one interface which marketing channels – including social – are performing best for each promotion. Besides being a Software-as-a-Service solution, social is also a Platform-as-a-Service solution. The benefit here is that customers can extend the functionality of our social applications to suit their particular needs or create their own social application from scratch. During the Social Developer track, developers are learning how to use Java and other industry-standard programming languages to plug social functionality into enterprise applications. To see how Social Relationship Management can help your business build better relationships and experiences with customers, visit us on the web at oracle.com/social. There are a lot more social-oriented sessions left at OpenWorld. To view a schedule of the upcoming social-oriented sessions, go here.


  • What should be taken into consideration when deploying Windows 8 in a domain environment?

    - by GaTechThomas
    Edit: reformulating the question. We have ordered new laptops, but before they arrive our development team is trying to decide whether to install Windows 8 or stick with Windows 7. We have already tested on isolated machines, but we have not yet been allowed to add the machines to the domain. Before we approach the networking group to discuss adding Windows 8 machines to the domain, we need more information on what changes and issues to expect in moving from Windows 7. Are there any aspects we should consider that are specific to Windows 8 clients? Thus far, I've gotten the following feedback: Windows administrative shares are disabled; there is a new set of Group Policy templates; and there are changes to proxy server settings. Additional items along these lines would be helpful. We're not looking for items related to Windows GUI changes, but instead primarily items related to having the machine live and be used on the domain.


  • Ubuntu on Android for Samsung Galaxy note 2?

    - by schulzey
    This question is lengthy, I have to warn you. I'm generally a Mac user. My setup currently consists of using Mac OS X and VMware (Windows 7) to run one program that isn't compatible with OS X or Linux. This program can be run using CrossOver (Wine) as well, but it works better with VMware. I'm thinking of switching over to Ubuntu as my main operating system on my computers and running VMware or CrossOver that way to access the program I need. No specific reason why; I guess I'm just sick of the Apple/Windows machine. I've always wanted to try Linux, so I figure there is no better time than now. The point is I haven't used Ubuntu yet, so I don't know a lot about it. I was planning on buying the iPhone 5 when I saw there is an app to set up the Ubuntu operating system on an Android phone. I'm now thinking that this would be fantastic for me if I bought the Samsung Galaxy Note 2 and was able to install Ubuntu and either VMware or CrossOver to use my Windows-specific program for work right on my phone. The Samsung Galaxy Note 2 has a 1.4 GHz processor and 2GB of RAM, which is enough to handle a modern operating system. My program does not need a lot of speed or memory to work. To be honest, the Windows-specific program I need would work fine on an old laptop running Windows XP. My first question is whether Ubuntu for Android is really the full operating system that lets you run programs just like a desktop PC. Is it super slow, where it takes many minutes to load up Ubuntu? I don't need blazing speeds, but I'd like something usable. My next question has to do with VMware or CrossOver: is Ubuntu for Android capable of running these programs? I think it would be great to use the same operating system, like Ubuntu, on my desktop, laptop, tablets, and phone. Thanks so much for all the help!!!


  • Windows 7 Natively on Mac OS X Bootcamp, Airport Wifi Unable To Connect....

    - by Goober
    Hello! I am using a brand new MacBook Pro. I am running a copy of Windows 7 natively via Boot Camp (no use of virtual machine software at all). However, the only way I can get Windows to connect to the internet is via Ethernet, as opposed to the Mac's AirPort card picking up the wireless. It just refuses to connect, and gives me a limited access status. Any ideas!? I've run Windows XP natively via Boot Camp and I had a few issues with the network constantly dropping out; however, I blamed that on the drivers and the general shiteness of XP.... Help greatly appreciated.


  • FileOpenPicker/FileSavePicker doesn't allow *.* wildcard file associations

    - by mbrit
    On Twitter, Matthias Jauernig commented that the FileOpenPicker and FileSavePicker don't allow *.* wildcard file associations. I was relaxed about this and wrote back that it was related to sandboxing, implying it was a "good thing"; however, as Matthias commented back, perhaps it's not. In Metro-style apps the sandboxing works such that if something gives you a file (e.g. the picker, or a share operation), you can access it regardless of where it is on the system. If you find the file yourself, you have to declare the type. The reason why I think it's related to sandboxing is that if you work with files programmatically you have to be explicit about the file types. This is to stop malware that you think is only interested in, say, .PDF files from scanning and uploading any .EML files that it can find on the machine. It follows then that on the pickers that restriction would continue. It allows the retail store team to validate that an app is likely to behave itself. If it's an app that works with images, locking down the picker so that it can only access image file types makes sense. However, Matthias mentioned that he has an app that should allow files of any arbitrary type. That fits more into the "if the user selects it, it must be OK" camp than the "programmatic scanning" camp. So now I'm left wondering why the picker doesn't allow any type to be selected. I think then maybe the decision comes down to simplicity. A lot of the decisions in Metro-style design relate to ideas about "zero intimidation". Allowing the user to select any file is too much like Old Windows, and not enough like Reimagined Windows. What happens in Matthias's app if the user selects Explorer.exe as the file he or she wants to work with? I guess it's fine if you expect your user to know what they're doing (Old Windows), but not so fine if you're expecting a three-year-old to work with it (Reimagined Windows).


  • Running 'sudo' over SSH

    - by Wesho
    I'm writing a script which is to log onto a bunch of remote machines and run a command on them. I've set up keys, so the user running the script does not have to type the password of each machine, but only the passphrase at the beginning of the script. The problem is that the command on the remote machines requires sudo to run, and at the same time the whole point of the script is to rid the user of having to type in passwords multiple times. Is there a way to avoid typing in the password for sudo? Changing the permissions of the command on the remote machines is not an option.
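    The usual approach, sketched below with placeholder names, is a NOPASSWD sudoers rule on each remote machine scoped to just the one command (added with visudo so a syntax error can't lock you out), plus -t on the ssh call so sudo gets a terminal if requiretty is enforced:

        # on each remote machine, added via 'sudo visudo' or a file under /etc/sudoers.d/
        deploy ALL=(root) NOPASSWD: /usr/local/bin/the_command

        # from the script
        ssh -t deploy@remote-host 'sudo /usr/local/bin/the_command'

    Scoping the rule to a single command keeps the account from gaining general password-less root, which matters here because the key-based login is already password-free.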


  • Good window management grid keyboard shortcuts on keyboards without a numeric keypad

    - by Bryce Thomas
    I like to use Winsplit Revolution to position open windows in a specific place on my screen in a grid-like fashion. One of the things I like about Winsplit Revolution is that the default keyboard shortcuts use the physical layout of the numeric keypad as a mnemonic for where each key positions a window (e.g. Ctrl + Alt + 7 positions window in top left hand corner because 7 is in top left hand corner and Ctrl + Alt + 3 positions window in bottom right hand corner because 3 is in bottom right hand corner). I am looking to get a laptop (Macbook Pro) whose keyboard does not feature a numeric keypad. Can anyone suggest a set of keyboard shortcuts on such a machine that provides a similar mnemonic to aid in remembering what each shortcut does, rather than a simple arbitrary assignment of shortcuts? To be clear, I am not interested in specific window management software, just suggestions for keyboard shortcuts that are easy to remember.


  • cassandra node discovery

    - by eQuiNoX__
    I just set up a 3-node system with IP addresses 192.168.0.101, 192.168.0.102 and 192.168.0.103. I have set the seeds value in the configuration of all three machines as seeds: "192.168.0.101,192.168.0.102,192.168.0.103". However, on running nodetool on any of them, only the 103 machine gets discovered:
    node101:/opt/cassandra/apache-cassandra-0.8.5/bin# ./nodetool ring -h 192.168.0.101
    Address          DC           Rack   Status  State   Load       Owns      Token
    192.168.0.103    datacenter1  rack1  Up      Normal  151.96 KB  100.00%   38174485210079977599903748344879358256
    Could someone tell me where the problem lies?
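    A sketch of the cassandra.yaml settings worth double-checking on each node for 0.8.x; the values are examples, and the exact layout should be compared against the shipped configuration file. Besides the seed list, listen_address (and rpc_address) must be each node's own LAN IP rather than localhost, and cluster_name must match everywhere, or the nodes will not join the same ring:

        cluster_name: 'Test Cluster'        # identical on all three nodes
        listen_address: 192.168.0.101       # this node's own IP, not localhost
        rpc_address: 192.168.0.101
        seed_provider:
            - class_name: org.apache.cassandra.locator.SimpleSeedProvider
              parameters:
                  - seeds: "192.168.0.101,192.168.0.102"

    After fixing the addresses, restart each node and re-run nodetool ring; all three endpoints should then appear with their own tokens.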

