Search Results

Search found 29495 results on 1180 pages for 'cross site scripting'.


  • Open Source Scheduling Software?

    - by Kaiser Advisor
    Hi Everyone, I'm looking for scheduling software to schedule 25 people over 8 work sites. Most are full-time and can work up to 40 hours a week, but some are part-time and can only work certain days of the week and up to a certain number of hours a week. There are 3 classes of employees: Managers, Supervisors, and Workers. They should be shuffled so that they spend approximately equal time at each of the 8 work sites and with all classes of employees; i.e., Joe the worker should spend about 1 out of 8 days on each work site, and work with managers, supervisors, and other workers equally. I tried to do this in Excel with the Solver, but the shuffling requirement makes it way too complicated, so I'm stuck trying to do big parts of this manually with the Solver helping out with just the hour-provisioning piece. Is there any open source software that could help me? Much appreciated! KA

    Read the article

  • Great Blog Comments

    - by Paul Sorensen
    Just a quick note to let you know that, in the interest of keeping the most useful content available here on the Oracle Certification Blog, we do moderate the comments. We welcome (and encourage) dialog, questions, comments, etc. here on the topics at hand. We'll never 'censor' out a comment just because we don't like it - in fact, this is how we often learn ways in which we can do better. But of course we will filter out the typical list like anyone else: crude/offensive remarks, foul language, references to illegal activity, etc. We will also often redirect any customer-service-type inquiries to [email protected], where they can best be handled.
    Also, if you have a question of a general nature, please research it on the Oracle Certification website first. We often won't respond to questions such as "tell me how to get 11g OCP", as we've already made sure that you have that kind of information available. Now if we've inadvertently 'hidden' something on our site (gulp), then fair enough - please let us know that you're having a hard time finding it and we'll be sure to try and "unbury" it ;-)
    Additionally, you may have more of an 'opinion' type question, such as "should I do 'x' certification or 'y' certification?" For these, we highly recommend checking the Oracle Technology Network (OTN) Certification Forum, where you can engage in peer-to-peer discussions and share techniques, advice and best practices with others in the field.
    In the meantime, please continue to share your thoughts, ideas, opinions, tech tips, etc. - we look forward to seeing them and passing them along wherever we can!
    QUICK LINKS:
    Oracle Certification Website
    Email - Customer Service
    Oracle Technology Network (OTN) Certification Forum

    Read the article

  • Nginx or Apache for a VPS?

    - by James
    I consider myself to be an inexperienced user/administrator when it comes to running my VPS. I can get by with a few CLI commands, I can set up Webmin and I can set up Yum repos, but beyond the very basic stuff, I'm out of my depth. So far, I'm running Apache. I don't know it particularly well, but I can get by with editing httpd.conf if I'm told what to edit. I've heard good things about Nginx and that it's not as resource-hungry as Apache. I'd like to give it a go, but I can't find any information about its suitability for administrators like me, with little experience of sysadmin or web server config. Webmin now has support for Nginx, so getting it installed and running probably won't be too much of a problem. What I'm wondering is, from a site administrator's perspective, is running Nginx as transparent as running Apache? I.e., at the moment, I can just throw up WordPress and Drupal sites without having much to worry about or having to make any config changes to Apache. Would Nginx be as transparent?
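    For context on the "transparent" question: unlike Apache with mod_php, Nginx has no .htaccess support and hands PHP off to a separate FastCGI process, so each PHP site needs a small server block. A minimal sketch of the kind of block a WordPress or Drupal site typically needs on a Debian-style layout (the domain, document root and php5-fpm socket path below are assumptions for illustration, not part of the original question):
      sudo tee /etc/nginx/sites-available/example.com <<'EOF'
      server {
          listen 80;
          server_name example.com;
          root /var/www/example.com;
          index index.php index.html;

          # send pretty URLs to the front controller, as WordPress/Drupal expect
          location / { try_files $uri $uri/ /index.php?$args; }

          # hand .php requests to PHP-FPM over its local socket
          location ~ \.php$ {
              include fastcgi_params;
              fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
              fastcgi_pass unix:/var/run/php5-fpm.sock;
          }
      }
      EOF
      sudo ln -s /etc/nginx/sites-available/example.com /etc/nginx/sites-enabled/
      sudo nginx -t && sudo service nginx reload
    Once a block like this is in place the CMSs themselves run unchanged; the per-site config above is the part that Apache's .htaccess support normally hides from you.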

    Read the article

  • From DBA to Data Analyst

    - by Denise McInerney
    Cross posted from the PASS Blog There is a lot changing in the data professional’s world these days. More data is being produced and stored. More enterprises are trying to use that data to improve their products and services and understand their customers better. More data platforms and tools seem to be crowding the market. For a traditional DBA this can be a confusing and perhaps unsettling time. It’s also a time that offers great opportunity for career growth. I speak from personal experience. We sometimes refer to the “accidental DBA”, the person who finds herself suddenly responsible for managing the database because she has some other technical skills. While it was not accidental, six months ago I was unexpectedly offered a chance to transition out of my DBA role and become a data analyst. I have since come to view this offer as a gift, though at the time I wasn’t quite sure what to do with it. Throughout my DBA career I’ve gotten support from my PASS friends and colleagues and they were the first ones I turned to for counsel about this new situation. Everyone was encouraging and I received two pieces of valuable advice: first, leverage what I already know about data and second, work to understand the business’ needs. Bringing the power of data to bear to solve business problems is really the heart of the job. The challenge is figuring out how to do that. PASS had been the source of much of my technical training as a DBA, so I naturally started there to begin my Business Intelligence education. Once again the Virtual Chapter webinars, local chapter meetings and SQL Saturdays have been invaluable. I work in a large company where we are fortunate to have some very talented data scientists and analysts. These colleagues have been generous with their time and advice. I also took a statistics class through Coursera where I got a refresher in statistics and an introduction to the R programming language. And that’s not the end of the free resources available to someone wanting to acquire new skills. There are many knowledgeable Business Intelligence and Analytics professionals who teach through their blogs. Every day I can learn something new from one of these experts. Sometimes we plan our next career move and sometimes it just happens. Either way a database professional who follows industry developments and acquires new skills will be better prepared when change comes. Take the opportunity to learn something about the changing data landscape and attend a Business Intelligence, Business Analytics or Big Data Virtual Chapter meeting. And if you are moving into this new world of data consider attending the PASS Business Analytics Conference in April where you can meet and learn from those who are already on that road. It’s been said that “the only thing constant is change.” That’s never been more true for the data professional than it is today. But if you are someone who loves data and grasps its potential you are in the right place at the right time.

    Read the article

  • Backup to disk, encrypted, without any installed local software

    - by user30064
    Hi, OK, this is a tough one, and it might not even be possible, but no harm in asking I guess. I have a Buffalo TeraStation file server that I use for network-attached storage. After a couple of phone calls to customer services I realised that there is no way to back up to disk encrypted. In effect, I would be carrying unencrypted company data off-site daily, which is obviously unacceptable. I had a go at TrueCrypt, EncFS, and a few others, and as far as I could see all of them require that you install some software on the machine that is to use the file system, which makes sense. Unfortunately the firmware on the TeraStation is closed and I cannot install any software (and I can't build from source either, since Buffalo didn't include a compiler). Are there any ways to copy files to disk so that, as soon as they are written to the disk, they are transparently encrypted, without having to install additional software? I'm not sure it matters too much, but the TeraStation firmware is Linux-based, although, as I mentioned, closed. Many thanks, Andreas
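    One hedged workaround, given that nothing can be installed on the TeraStation itself, is to do the encryption on a separate Linux machine that mounts the NAS share and writes to a LUKS-encrypted backup disk. A rough sketch only; the device name, share name and mount points are assumptions, and it requires cryptsetup and cifs-utils on that other machine:
      # one-time setup: encrypt and format the removable backup disk
      sudo cryptsetup luksFormat /dev/sdb1
      sudo cryptsetup luksOpen /dev/sdb1 backup
      sudo mkfs.ext4 /dev/mapper/backup
      sudo mkdir -p /mnt/backup /mnt/nas

      # each backup run (disk re-attached): open the encrypted disk, mount the share read-only, copy
      sudo cryptsetup luksOpen /dev/sdb1 backup
      sudo mount /dev/mapper/backup /mnt/backup
      sudo mount -t cifs //terastation/share /mnt/nas -o ro,guest
      rsync -a /mnt/nas/ /mnt/backup/
      sudo umount /mnt/nas /mnt/backup
      sudo cryptsetup luksClose backup
    The data still leaves the TeraStation unencrypted over the LAN, but what travels off-site on the disk is encrypted at rest.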

    Read the article

  • Install proprietary NVIDIA drivers on 14.04 (Steam segmentation fault)

    - by allthosemiles
    Recently, I finally got the official drivers for my NVIDIA 560 Ti card installed on Ubuntu 14.04 (hooray). However, I started looking into installing Steam and I'm getting segmentation faults when I try to run the software. I tried installing 32-bit libs and it seemed like they weren't available or were already installed. Upon further investigation, I found that a suggested solution is to install the proprietary drivers, install Steam, then switch back to the other drivers. I'm not really sure what "proprietary drivers" are, in all honesty. Has anyone gone through this process who could provide some insight here? (For reference, I installed the official 64-bit driver from the NVIDIA site for my 560 Ti, and the installed Ubuntu version is 64-bit as well.)
    Update: This is the error text I get when trying to run Steam after installing it via the Ubuntu store:
      Running Steam on ubuntu 14.04 64-bit
      STEAM_RUNTIME is enabled automatically
      Installing breakpad exception handler for appid(steam)/version(1401381906_client)
      /home/dbrewer/.steam/steam.sh: line 755: 3943 Segmentation fault (core dumped) $STEAM_DEBUGGER "$STEAMROOT/$PLATFORM/$STEAMEXE" "$@"
      mv: cannot stat ‘/home/dbrewer/.steam/registry.vdf’: No such file or directory
      Installing bootstrap /home/dbrewer/.steam/bootstrap.tar.xz
      Reset complete!
      Restarting Steam by request...
      Running Steam on ubuntu 14.04 64-bit
      STEAM_RUNTIME has been set by the user to: /home/dbrewer/.steam/ubuntu12_32/steam-runtime
      Installing breakpad exception handler for appid(steam)/version(1401381906_client)
      /home/dbrewer/.steam/steam.sh: line 755: 4066 Segmentation fault (core dumped) $STEAM_DEBUGGER "$STEAMROOT/$PLATFORM/$STEAMEXE" "$@"
    What I get when I run "steam --reset":
      mv: cannot stat ‘/home/dbrewer/.steam/registry.vdf’: No such file or directory
      Installing bootstrap /home/dbrewer/.steam/bootstrap.tar.xz
      Reset complete!
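    For what it's worth, "proprietary drivers" here just means NVIDIA's own binary driver rather than the open-source nouveau driver. A hedged sketch of installing it plus the 32-bit GL libraries Steam's 32-bit runtime links against on 14.04 (the nvidia-331 package name is an assumption; `ubuntu-drivers devices` will report what is actually recommended for the card):
      # see which binary driver Ubuntu recommends for this GPU
      sudo ubuntu-drivers devices

      # install the recommended NVIDIA binary driver (nvidia-331 was current for 14.04)
      sudo apt-get install nvidia-331

      # 32-bit libraries the 32-bit Steam client needs on a 64-bit install
      sudo apt-get install libgl1-mesa-glx:i386 libc6:i386

      sudo reboot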

    Read the article

  • How-To Backup, Swap, and Update Your Wii Game Saves

    - by Jason Fitzpatrick
    Whether you want to backup your game saves because you’ve worked so hard on them or you want to import game saves precisely so you don’t have to work so hard, we’ve got you covered. Image adapted from icon set by GasClown. There are a multitude of reasons you might want to export and import game saves from your Wii including: saving the progress on your favorite games before sending in your Wii for service, copying the progress to a friend’s or your secondary Wii, and importing saved games from the web or your friend’s Wii so that you don’t have to bust your ass to unlock all the specialty items yourself. (Here’s looking at you Mario Kart and House of the Dead: Overkill.)

    Read the article

  • Is there a way to allow administrators to change or reset user passwords?

    - by Jon Seigel
    We have a custom MembershipProvider implementation using forms-based authentication (FBA) under SharePoint 2007. I've searched high and low on Google, but only found: Active Directory and FBA implementations that allow users to change their own passwords; and Active Directory instructions (including video!) for administrators to change other users' passwords. Have we missed an option to enable the latter under FBA? Should this work by default, and is the MembershipProvider misbehaving? The same procedure as under Active Directory would be ideal, but the "Change Password" link does not appear in the Edit User screen. We verified that the logged-in user is a site collection administrator.

    Read the article

  • Building ATLAS (and later Octave w/ ATLAS)

    - by David Parks
    I'm trying to set up ATLAS (so I can later compile Octave with ATLAS support). If I'm correct, I still need to build this manually due to the environment-specific optimizations. I do see a package for ATLAS, but it looks like it's using the cross-platform generic build options (e.g. "it'll be slow"). So, running the configure script as described in the docs seems to go poorly. As a Java developer I never do well at making heads or tails of errors in these build processes. Am I missing dependencies (if so, is there any documentation on what I need)?
      allusers@vbubuntu:~/Downloads/atlas3.10.1/build_vbubuntu$ ../configure -b 64 -D c -DPentiumCPS=3000 --with-netlib-lapack-tarfile=/home/allusers/Downloads/lapack-3.5.0.tgz
      make: `xconfig' is up to date.
      ./xconfig -d s /home/allusers/Downloads/atlas3.10.1/build_vbubuntu/../ -d b /home/allusers/Downloads/atlas3.10.1/build_vbubuntu -b 64 -D c -DPentiumCPS=3000 -Si lapackref 1
      OS configured as Linux (1)
      Assembly configured as GAS_x8664 (2)
      Vector ISA Extension configured as SSE3 (6,448)
      ERROR: enum fam=3, chip=2, mach=0
      make[3]: *** [atlas_run] Error 44
      make[2]: *** [IRunArchInfo_x86] Error 2
      Architecture configured as Corei1 (25)
      ERROR: enum fam=3, chip=2, mach=0
      make[3]: *** [atlas_run] Error 44
      make[2]: *** [IRunArchInfo_x86] Error 2
      Clock rate configured as 2350Mhz
      ERROR: enum fam=3, chip=2, mach=0
      make[3]: *** [atlas_run] Error 44
      make[2]: *** [IRunArchInfo_x86] Error 2
      Maximum number of threads configured as 4
      Parallel make command configured as '$(MAKE) -j 4'
      ERROR: enum fam=3, chip=2, mach=0
      make[3]: *** [atlas_run] Error 44
      make[2]: *** [IRunArchInfo_x86] Error 2
      Cannot detect CPU throttling.
      rm -f config1.out
      make atlas_run atldir=/home/allusers/Downloads/atlas3.10.1/build_vbubuntu exe=xprobe_comp redir=config1.out \
      args="-v 0 -o atlconf.txt -O 1 -A 25 -Si nof77 0 -V 448 -b 64 -d b /home/allusers/Downloads/atlas3.10.1/build_vbubuntu"
      make[1]: Entering directory `/home/allusers/Downloads/atlas3.10.1/build_vbubuntu'
      cd /home/allusers/Downloads/atlas3.10.1/build_vbubuntu ; ./xprobe_comp -v 0 -o atlconf.txt -O 1 -A 25 -Si nof77 0 -V 448 -b 64 -d b /home/allusers/Downloads/atlas3.10.1/build_vbubuntu > config1.out
      make[2]: gfortran: Command not found
      make[2]: *** [IRunF77Comp] Error 127
      make[2]: g77: Command not found
      make[2]: *** [IRunF77Comp] Error 127
      make[2]: f77: Command not found
      make[2]: *** [IRunF77Comp] Error 127
      Unable to find usable compiler for F77; aborting
      Make sure compilers are in your path, and specify good compilers to configure (see INSTALL.txt or 'configure --help' for details)
      make[1]: *** [atlas_run] Error 8
      make[1]: Leaving directory `/home/allusers/Downloads/atlas3.10.1/build_vbubuntu'
      make: *** [IRun_comp] Error 2
      ERROR 512 IN SYSCMND: 'make IRun_comp args="-v 0 -o atlconf.txt -O 1 -A 25 -Si nof77 0 -V 448 -b 64"'
      mkdir src bin tune interfaces
      mkdir: cannot create directory ‘src’: File exists
      mkdir: cannot create directory ‘bin’: File exists
      mkdir: cannot create directory ‘tune’: File exists
      mkdir: cannot create directory ‘interfaces’: File exists
      make: *** [make_subdirs] Error 1
      make -f Make.top startup
      make[1]: Entering directory `/home/allusers/Downloads/atlas3.10.1/build_vbubuntu'
      Make.top:1: Make.inc: No such file or directory
      Make.top:325: warning: overriding commands for target `/AtlasTest'
      Make.top:76: warning: ignoring old commands for target `/AtlasTest'
      make[1]: *** No rule to make target `Make.inc'. Stop.
      make[1]: Leaving directory `/home/allusers/Downloads/atlas3.10.1/build_vbubuntu'
      make: *** [startup] Error 2
      mv: cannot move ‘lapack-3.5.0’ to ‘../reference/lapack-3.5.0’: Directory not empty
      mv: cannot stat ‘lib/Makefile’: No such file or directory
      ../configure: 450: ../configure: cannot create lib/Makefile: Directory nonexistent
      ../configure: 451: ../configure: cannot create lib/Makefile: Directory nonexistent
      ../configure: 452: ../configure: cannot create lib/Makefile: Directory nonexistent
      ../configure: 453: ../configure: cannot create lib/Makefile: Directory nonexistent
      ../configure: 509: ../configure: cannot create lib/Makefile: Directory nonexistent
      DONE configure
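    The repeated "gfortran: Command not found / g77: Command not found" lines in that log are the give-away: ATLAS's configure cannot find any Fortran 77 compiler, and the later failures cascade from that. A sketch of installing the missing build dependencies and retrying from a clean build directory (package names as found in the Ubuntu repositories):
      sudo apt-get install build-essential gfortran
      # start from a fresh build directory so no half-configured state is left behind
      cd ~/Downloads/atlas3.10.1
      rm -rf build_vbubuntu && mkdir build_vbubuntu && cd build_vbubuntu
      ../configure -b 64 -D c -DPentiumCPS=3000 \
          --with-netlib-lapack-tarfile=/home/allusers/Downloads/lapack-3.5.0.tgz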

    Read the article

  • How to fix sound in Wolfenstein: Enemy Territory

    - by GrizzLy
    I installed Wolf:ET and I can't get sound to work. Everything I have installed is in the default paths. I had 10.04 and then upgraded to 10.10 via the Software Update GUI. I had sound working in 10.04 with the method under 2. I have tried the following:
      killall esd; et; esd
    With that I get:
      ------- sound initialization -------
      /dev/adsp: No such file or directory
      Could not open /dev/adsp
      ------------------------------------
    Then:
      sudo -i
      echo "et.x86 0 0 direct" > /proc/asound/card0/pcm0p/oss
      echo "et.x86 0 0 disable" > /proc/asound/card0/pcm0c/oss
      exit
    With that I get:
      bash: /proc/asound/card0/pcm0p/oss: No such file or directory
    and indeed I do not have that; I have only sub0 and sub1 in pcm0p. I have tried running ET with the et-sdl-sound script, but with that I get this output in the console: http://pastebin.com/J7gRU1uh
    I have probably messed up the SDL libraries; I could not get sound to work, so I downloaded new ones from the Debian package site and installed them. Tried setting SDL_AUDIODRIVER="pulse" in et-sdl-sound; it looks like I am getting the same error as in method 3.
      pasuspender -- et +set s_alsa_pcm plughw:0
    gives me:
      ------- sound initialization -------
      /dev/adsp: No such file or directory
      Could not open /dev/adsp
      ------------------------------------
    Misc: @Oli: I do not know if I am running pulse or esd; how can I check that?
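    On the last question, and as a general fallback worth trying: PulseAudio ships an OSS wrapper, padsp, that emulates the old /dev/dsp-style devices for games like ET. A short sketch, assuming PulseAudio is what is actually running (pactl and padsp come from the pulseaudio-utils package):
      # check whether PulseAudio or esd is running
      ps -e | egrep 'pulseaudio|esd'
      pactl info          # prints server details if PulseAudio is up

      # run the game through the PulseAudio OSS emulation layer
      padsp et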

    Read the article

  • POST attack on my website

    - by benhowdle89
    Hi, I have a site (humanisms.co.uk) which incorporates a voting system, i.e. a user clicks "Up" and it sends a parameter to a PHP script via AJAX, the PHP inserts the vote into a MySQL db and the new "Up" vote count is sent back to the page. This is working great, but I've noticed that the number of votes for one of my questions shot up last night. I viewed my web host's access logs and saw this line:
      108.27.195.232 - - [03/Mar/2011:15:20:18 +0000] "POST /vote.php HTTP/1.1" 200 2 "http://www.humanisms.co.uk/" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.114 Safari/534.16"
    This is repeated well over 100 times, sometimes more than once a second. Now I know they probably aren't sitting there clicking Vote, but running some sort of scripted loop. I'm not worried about SQL injection, but what can I do to prevent this same IP address from doing this, and what can I do in general to avoid this scenario? I should also say that there's no login, so anyone can click using the voting system. Thanks

    Read the article

  • Management Reporter Installation – Lessons Learned

    - by Ryan McBee
    After successfully completing several installations of Management Reporter this year, I wanted to share a few lessons learned that should help you. First, you will want to make sure that you install Management Reporter under a domain account as opposed to a local system or network service account. Management Reporter gives you the option to install under these accounts, but it is a best practice to use a domain account. Upon installation of Management Reporter, you will want to make sure that Directory Browsing is enabled within the IIS server of your site, or you will have problems when you go to use Management Reporter. By default, it will be disabled in Server 2008 R2, and you will need to make the setting change under the Actions pane in IIS Manager. Lastly, you will want to make sure that SQL Server is running under a domain account. I have had multiple situations where reports have been stuck in the Queued status rather than the Processing status in Management Reporter. After reviewing resolution 5 of KB 2298248, it was determined that running SQL Server under a domain account is the way to go.
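    Directory Browsing can also be flipped on from the command line instead of the IIS Manager UI; a hedged one-liner sketch using appcmd from an elevated prompt (the "Default Web Site" scope is an assumption - point it at the Management Reporter site if it lives elsewhere):
      %windir%\system32\inetsrv\appcmd set config "Default Web Site" /section:directoryBrowse /enabled:true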

    Read the article

  • Need help configuring my Tomcat server without any WAR files

    - by gablin
    I just reinstalled my entire server, and now I can't seem to get my JSP-based website to work on Tomcat anymore. I use the same server.xml file, which worked perfectly before the reinstallation, but no longer. Here's the content of the server.xml file which worked before: <!--APR library loader. Documentation at /docs/apr.html --> <Listener className="org.apache.catalina.core.AprLifecycleListener" SSLEngine="on" /> <!--Initialize Jasper prior to webapps are loaded. Documentation at /docs/jasper-howto.html --> <Listener className="org.apache.catalina.core.JasperListener" /> <!-- JMX Support for the Tomcat server. Documentation at /docs/non-existent.html --> <Listener className="org.apache.catalina.mbeans.ServerLifecycleListener" /> <Listener className="org.apache.catalina.mbeans.GlobalResourcesLifecycleListener" /> <!-- Global JNDI resources Documentation at /docs/jndi-resources-howto.html --> <GlobalNamingResources> <!-- Editable user database that can also be used by UserDatabaseRealm to authenticate users --> <Resource name="UserDatabase" auth="Container" type="org.apache.catalina.UserDatabase" description="User database that can be updated and saved" factory="org.apache.catalina.users.MemoryUserDatabaseFactory" pathname="conf/tomcat-users.xml" /> </GlobalNamingResources> <!-- A "Service" is a collection of one or more "Connectors" that share a single "Container" Note: A "Service" is not itself a "Container", so you may not define subcomponents such as "Valves" at this level. Documentation at /docs/config/service.html --> <Service name="Catalina"> <!--The connectors can use a shared executor, you can define one or more named thread pools--> <!-- <Executor name="tomcatThreadPool" namePrefix="catalina-exec-" maxThreads="150" minSpareThreads="4"/> --> <!-- A "Connector" represents an endpoint by which requests are received and responses are returned. Documentation at : Java HTTP Connector: /docs/config/http.html (blocking & non-blocking) Java AJP Connector: /docs/config/ajp.html APR (HTTP/AJP) Connector: /docs/apr.html Define a non-SSL HTTP/1.1 Connector on port 8080 --> <Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" /> <!-- A "Connector" using the shared thread pool--> <!-- <Connector executor="tomcatThreadPool" port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" /> --> <!-- Define a SSL HTTP/1.1 Connector on port 8443 This connector uses the JSSE configuration, when using APR, the connector should be using the OpenSSL style configuration described in the APR documentation --> <!-- <Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS" /> --> <!-- Define an AJP 1.3 Connector on port 8009 --> <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" /> <!-- An Engine represents the entry point (within Catalina) that processes every request. The Engine implementation for Tomcat stand alone analyzes the HTTP headers included with the request, and passes them on to the appropriate Host (virtual host). 
Documentation at /docs/config/engine.html --> <!-- You should set jvmRoute to support load-balancing via AJP ie : <Engine name="Standalone" defaultHost="localhost" jvmRoute="jvm1"> --> <Engine name="Catalina" defaultHost="localhost"> <!--For clustering, please take a look at documentation at: /docs/cluster-howto.html (simple how to) /docs/config/cluster.html (reference documentation) --> <!-- <Cluster className="org.apache.catalina.ha.tcp.SimpleTcpCluster"/> --> <!-- The request dumper valve dumps useful debugging information about the request and response data received and sent by Tomcat. Documentation at: /docs/config/valve.html --> <!-- <Valve className="org.apache.catalina.valves.RequestDumperValve"/> --> <!-- This Realm uses the UserDatabase configured in the global JNDI resources under the key "UserDatabase". Any edits that are performed against this UserDatabase are immediately available for use by the Realm. --> <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/> <!-- Define the default virtual host Note: XML Schema validation will not work with Xerces 2.2. --> <!-- <Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="true" xmlValidation="false" xmlNamespaceAware="false"> --> <!-- SingleSignOn valve, share authentication between web applications Documentation at: /docs/config/valve.html --> <!-- <Valve className="org.apache.catalina.authenticator.SingleSignOn" /> --> <!-- Access log processes all example. Documentation at: /docs/config/valve.html --> <!-- <Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs" prefix="localhost_access_log." suffix=".txt" pattern="common" resolveHosts="false"/> --> <!-- </Host> --> <Host name="www.rebootradio.nu"> <Alias>rebootradio.nu</Alias> <Context path="" docBase="D:/services/http/rebootradio.nu" debug="1" reloadable="true"/> </Host> </Engine> </Service> </Server> The JSP site doesn't use any WAR files or anything like that; there's just a default.jsp in the specified folder D:/services/http/rebootradio.nu which loads the site. As I said, this configuration worked before, but now with the latest verion of XAMPP and Tomcat it doesn't work anymore. All I get is a 404 message saying The requested resource () is not available.

    Read the article

  • Oracle OpenWorld 2012: Oracle Developer Cloud, ADF-Essentials, ADF Mobile and ME!

    - by Dana Singleterry
    This year at OOW, like those from the past, will certainly be unforgettable. Lots of new announcements which I can't mention here and may not even know about are sure to surprise. I'll keep this short and sweet. For every session on ADF, ADF Mobile, Oracle Developer Cloud, integration with SOA Suite, etc., take a look at the ADF Focus Document listing all the sessions ordered by day, with time and location. For Mobile specifically, check out the Mobile Focus Document. OOW 2012 actually kicks off on Sunday with the Moscone North demogrounds hosting Cloud. There's also the ADF EMG User Day, where you can pick up many technical tips & tricks from ADF developers and ACE Directors from around the world. A session you shouldn't miss, and a great starting point for the week if you miss Sunday's ADF EMG User Day, for all of you technophiles, is Chris Tonas's keynote for developers - Monday 10:45 am at Salon 8 in the Marriott - The Future of Development for Oracle Fusion - From Desktop to Mobile to Cloud. Then peruse the ADF Focus Document to fill out your day with the many sessions and labs on ADF. Don't forget that Wednesday afternoon (4:30 - 5:30) offers an ADF Meetup, which is an excellent opportunity to catch up with the shakers and makers of ADF, from product management, to customers, to top developers leveraging the ADF technology, to ACE Directors themselves. Not to mention free beer is provided to help you wind down from a day of techno overload. Now for my schedule, and I do hope to see some of you at one of these.
    OOW 2012 Schedule
    10/1 Monday
    9:30am – 12:00pm: JDev DemoGrounds
    3:15pm – 4:15pm: Intro to Oracle ADF HOL; Marriott Marquis – Salon 3/4
    4:00pm – 6:00pm: Cloud DemoGrounds
    10/2 Tuesday
    9:45am – 12:00pm: JDev DemoGrounds
    2:00pm – 4:00pm: Cloud DemoGrounds
    7:30pm – 9:30pm: Team Dinner @ Donato Enoteca, Redwood City
    10/3 Wednesday
    10:15am – 11:15am: Intro to Oracle ADF HOL; Marriott Marquis – Salon 3/4
    1:15pm – 2:15pm: Oracle ADF – Lessons Learned in Real-World Implementations; Moscone South – Room 309. This session takes the form of a panel that consists of three customers: Herbalife, Etiya, and the Hawaii State Department of Education. During the first part of this session each customer will provide a high-level overview of their application. Following this overview I'll ask questions of the customers specific to their implementations and lessons learned through their development life-cycle. Here's the session abstract:
    CON3535 - Oracle ADF: Lessons Learned in Real-World Implementations. This session profiles and interviews customers that have been successful in delivering compelling applications based on Oracle's Application Development Framework (Oracle ADF). The session provides an overview of Oracle ADF, and then three customers explain the business drivers for their respective applications and solutions and, if possible, provide a demonstration of the applications. Interactive questions posed to the customers after their overview will make for an exciting and dynamic format in which the customers will provide insight into real-world lessons learned in developing with Oracle ADF.
    3:30pm – 4:30pm: Developing Applications for Mobile iOS and Android Devices with Oracle ADF Mobile; Marriott Marquis – Salon 10A
    4:30pm – 6:00pm: Meet and Greet ADF Developers/Customers; OTN Lounge
    10/4 Thursday
    11:15am – 12:15pm: Intro to Oracle ADF HOL; Marriott Marquis – Salon 3/4
    I'm sure our paths will cross at some point during the week, and I look forward to seeing you at one of the many events. Enjoy OOW 2012!

    Read the article

  • How do I open a 60 MB PNG image on OS X?

    - by Topener
    Alright, so I've been looking around on this site for how to open a big PNG image. The question I found was about a 10 MB PNG; Xee apparently did the job. So I downloaded Xee for my 60 MB file, but it crashes. So do iPhoto, Pixelmator and Preview. In the Pixelmator and Xee cases I actually had to kill the computer and restart it; it crashed so 'hard' I couldn't get it to respond again. How do I open this file (and zoom it)? Specs: newly acquired MacBook Pro, 4 GB memory, 2.3 GHz i7; 58 MB PNG image, approx. 15000x30000 pixels.
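    For scale: a 15000x30000 image decodes to roughly 1.8 GB of raw RGBA, which is why GUI viewers on a 4 GB machine fall over. One hedged workaround is to downscale or tile it from the command line first; a sketch using ImageMagick installed via Homebrew (the filenames, scale factor and memory limits are arbitrary choices):
      brew install imagemagick

      # make a quarter-scale copy that any viewer can open comfortably
      convert -limit memory 1GiB -limit map 2GiB huge.png -resize 25% preview.png

      # or cut it into 4096x4096 tiles to inspect areas at full resolution
      convert -limit memory 1GiB -limit map 2GiB huge.png -crop 4096x4096 +repage tiles_%03d.png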

    Read the article

  • Why won't my Windows 7 KMS key work on my Server 2008 KMS server?

    - by Ryan Bolger
    Our Microsoft Volume Licensing site was recently updated to include our Windows 7 and Server 2008 R2 KMS keys. We have an existing KMS server running on Server 2008 (not R2). In an attempt to be proactive about supporting the new OSes in our environment, I unregistered the old KMS key with slmgr.vbs and tried registering the new key. The registration failed with error 0xC004F050. The description for that error was "The Software Licensing Service reported that the product key is invalid." What's wrong? I've checked and double-checked the key for typos against what is listed on the Volume Licensing website.
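    For what it's worth, a plain Server 2008 (pre-R2) KMS host generally cannot accept a Windows 7 / Server 2008 R2 KMS key until it has been patched with Microsoft's KMS update for the newer client OSes, so the key itself is usually fine; checking for that update before suspecting the key is a reasonable first step. A sketch of the slmgr sequence once the host supports the key (the x's are obviously a placeholder, not a real key):
      rem remove the old KMS host key, install the new one, activate, then verify
      cscript %windir%\system32\slmgr.vbs /upk
      cscript %windir%\system32\slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
      cscript %windir%\system32\slmgr.vbs /ato
      cscript %windir%\system32\slmgr.vbs /dlv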

    Read the article

  • Strategy for managing lots of pictures for a website

    - by Nate
    I'm starting a new website that will (hopefully) have a lot of user-generated pictures. I'm trying to figure out the best way to store and serve these pictures. The CMS I'm using (Umbraco) has a media library that puts a folder on the server for each image. Inside of there you can have different sizes of that same image. That folder has an ID on it, and the database has additional information for that image along with the ID of the folder. This works great for small sites, but what if the pictures get up to 10,000, 100,000 or 1,000,000? It seems like the lookup on the directory would take a long time to find the correct folder. I'm on Windows 2008 if that makes a difference. I'm not so worried about load. I can load-balance my server pretty easily and replicate the images across the servers. The nature of the site won't have a lot of users on it either, but it could have a lot of pics. Thanks. -Nate
    EDIT: After some thought I think I'm going to create a directory for each user under a root image folder, then have the user's pictures under that. I would be pretty stoked if I had even 5,000 users, so that shouldn't be too bad of a linear lookup. If it does get slow I will break it down into folders like /media/a/adam/image123.png. If it ever gets really big I will expand the above method to build a bigger tree. That would take a LOT of content though.
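    A common refinement of that /media/a/adam/ idea is to bucket by a couple of characters of a hash of the filename (or database ID) instead of the username, so no single directory ever grows unbounded regardless of how pictures are spread across users. Sketched in shell just to show the idea - the two-level split and the md5 choice are arbitrary, and the same few lines translate directly to the site's own language:
      # derive a balanced storage path from the file's name (or database ID)
      id="image123.png"
      hash=$(printf '%s' "$id" | md5sum | cut -c1-4)
      path="media/${hash:0:2}/${hash:2:2}/$id"
      echo "$path"    # e.g. media/ab/cd/image123.png (actual buckets depend on the hash)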

    Read the article

  • ASP.NET MVC Resource not found

    - by TheLorax
    I am working on an MVC project in Visual Studio 2010 with .NET Framework 4.0 + MVC 2, and everything works if I set the target framework to .NET 4.0. However, my host does not offer .NET 4.0, so in order to deploy the site I need to get it working on .NET 3.5. I tried converting it to ASP.NET 3.5 and everything builds just fine, except now when I try to load the homepage I get a 404 error saying:
      The resource cannot be found. Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly. Requested URL: /home
      Version Information: Microsoft .NET Framework Version: 2.0.50727.4927; ASP.NET Version: 2.0.50727.4927
    Anyone know why this is? Thank you for your help. TheLorax

    Read the article

  • Today's Links (6/30/2011)

    - by Bob Rhubart
    James Gosling Says He Doesn't Care About Java
    But here's the rest of the story: "What I really care about is the Java Virtual Machine as a concept," says Gosling, "because that is the thing that ties it all together; it's the thing that makes Java the language possible; it's the thing that makes things work on all kinds of different platforms; and it makes all kinds of languages able to coexist."
    Virtual Developer Day: SOA
    Accelerate your development with Oracle SOA Suite. Learn how in this FREE online workshop with hands-on labs, July 12th, 9 am to 1:30 pm PST.
    Podcast: Toronto Architect Day Panel Discussion
    Part 3 (of 4) is now available, in which the panel (including Oracle ACE Director Cary Millsap and InfoQ editor and co-founder Floyd Marinescu) discusses public vs. private cloud as the best strategy for small businesses and start-ups.
    WebLogic Weekly for June 27th, 2011 | James Bayer
    Bayer shares the latest resources for those with WebLogic on the brain.
    Griffiths Waite at Oracle Open World | Mark Simpson
    Oracle ACE Director Mark Simpson shares information on the presentations he's scheduled to give at Oracle OpenWorld San Francisco 2011.
    Kscope Solid Service Bus Implementations
    Peter Paul van de Beek's Kscope11 presentation "is aimed at supporting architects and especially developers to choose the right integration infrastructure for a job."
    Migration To Java EE 6 With Spring 3 - ...Could Become "Interesting" | Adam Bien
    "Put simply, big data implies datasets so large they can't normally be processed using a standard transactional database," says David Dorf. "The term 'noSQL' is often used in this context as well."
    Book Review: "Designing With the Mind In Mind" | Abhinav Agarwal
    According to Abhinav Agarwal, Jeff Johnson's new book is about "the theory of how the mind perceives information, of how humans understand what they read, and how our eyes are attuned to paying attention to not just what's happening in front of us but also at the periphery of our vision."
    BPM 11g Advanced Workshop | Martien van den Akker
    Martien van den Akker shares his thoughts on both the workshop he recently attended and on the Oracle BPM 11g product.
    Fusion Applications - What You Need To Know: Product Families | Floyd Teter
    "Fusion Applications are organized into seven groups of related products called Product Families," observes Oracle ACE Director Floyd Teter. "While the product features are organized according to the Business Process Model and can cross the boundaries of product families, the product family groupings are an easy way to wrap your mind around Fusion Apps."
    Grid Control: Refreshing Weblogic Domains | Dave Best
    Dave Best shares tips for avoiding problems when using Grid Control to centrally manage/monitor your environment.
    Webcast: Oracle to Announce Datanomic Integration Plans
    The combination of Datanomic technology and the previous acquisition of Silver Creek Systems will deliver a complete, integrated and best-of-breed solution for Data Quality. Learn about Oracle's strategy and product plans and how the new products acquired from Datanomic will impact your organization. July 19, 2011, 8:00am PT / 11:00am ET. Speakers include Michael Weingartner (Vice President, Product Development, Oracle), Martin Boyd (Senior Director, Product Strategy, Oracle), and Dain Hansen (Director, Product Marketing, Fusion Middleware, Oracle).

    Read the article

  • Need a PCIe desktop graphics card for dual-monitor

    - by Graham
    I have a mid-2008 workstation with two HD monitors supporting HDMI and DVI inputs. Since Ubuntu 11.10, I have experienced no end of trouble with my Nvidia Quadro NVS 290 in TwinView dual-monitor output. Others have similar desktop TwinView woes. I want a new graphics card. Previously I asked for a graphics card recommendation and the response was the Nvidia GeForce GTS 450... but really I'm looking for someone who has actually got a working dual-monitor desktop to tell me what card they use, so I can get something that is known to work. So please, people who have no issues with their 64-bit Ubuntu 12.04 Unity 3D desktop spread across two HD-resolution external monitors (either DVI or HDMI connector), and who also run Google Chrome (which throws a spanner due to its own GPU compositing)... please let me know what graphics card you have so I can buy one.
    Gathering options: these seem to be the Nvidia cards featuring dual DVI, but they all appear to be gaming cards - what has dual DVI and good support, but is not a massive gaming card?
    Nvidia GTS 450 (previously recommended) - 2x DVI
    Nvidia GTX 550 Ti (used by System76) - 2x DVI
    Nvidia GT 430 (used by System76) - 1x DVI, 1x HDMI
    Nvidia GT 640 (found on the Nvidia site) - 1x DVI, 1x HDMI (also GT 620, GT 630)
    Has anyone had a good desktop dual-monitor Unity 3D experience with ATI cards?

    Read the article

  • NFS or GFS for LVS 10 Server Setup

    - by Michael Robinson
    Currently we have a 10-server LVS hosting setup. The people we hired to set it up did not know anything about GFS, which was our preferred central storage file system solution. As we had a tight time constraint, we just told them to use whatever they were familiar with, which is NFS. I have since done some research, and it seems that NFS is not ideal for the type of high-traffic site we are hoping to build. I couldn't find much info online about the significant differences between the two. As we are setting up all the servers again right now, should we stick with NFS, or find someone who knows how to set up GFS and go with that? We need a setup that is highly reliable and scalable; after the initial setup is done, we expect large increases in traffic and load.

    Read the article

  • Ask the Readers: How Do You Deal with Bacn?

    - by Jason Fitzpatrick
    Most people get their fair share of email they want, email they don’t want at all (Spam), and a healthy dose of Bacn–email they want but not right now. How do you deal with your daily dose of Bacn? While Spam is unsolicited garbage you don’t ever want, Bacn is email content you’ve actively selected to receive (weather updates, coupons from your favorite retailers, web site digests, etc.) that isn’t as important as email from friends and coworkers. It’s email that you want but not right now. This week we want to hear all about your methods for wrangling Bacn so you can enjoy it when you’re in the mood but it doesn’t clutter up your inbox when you aren’t. Sound off in the comments with your Bacn handling tips and then check back in on Friday for the What You Said roundup to see how your fellow readers handle things.

    Read the article

  • Elementary OS boots to a terminal (other OS) [on hold]

    - by Benjamin Watson
    I'm new to this site; please forgive me if I missed some posting protocol of some sort. I am attempting to install Luna on my Samsung Series 2 laptop (AMD A8, Radeon 7640G), and when I click on "Try Luna" it just pulls up a terminal after the insignia (curvy E). When I install it, same issue. Ctrl-Alt-F7 reveals this (hand-typed, sorry if there are typos):
      Starting preload:
      *starting CUPS printing spooler/server
      *stopping save kernel messages preload.
      fsck from util-linux 2.20.1
      fsck from util-linux 2.20.1
      dosfsck 3.0.12, 29 oct 2011 FAT32, LFN
      /dev/sda1: 3 files, 245/189518 clusters
      /dev/sda2: clean, 133841/30294016 files, 2529529/121164544 blocks
      Skipping profile in /etc/apparmor.d/disable: usr.sbin.rsyslogd
      *starting AppArmor profiles
      speech-dispacher disabled; edit /etc/default/speech-dispenser
      *stopping system V initialisation compatibility
      *starting system V runlevel compatability
      *starting apci daemon
      *starting anac(h)ronistic cron
      *starting save kernal messages
      *starting ntp server ntpd
      *starting regular background program processing damon
      *starting deferred execution scheduler
      *stopping anac(h)ronistic cron
      *starting LightDM Display Manager
      *starting bluetooth daemon
      *starting mDNS/DNS-SD daemon
      *starting CPU interrupts balancing daemon
      *stopping Send an event to indicate plymouth is up
      saned disabled; edit /etc/default/saned
      *starting network connection manager
      *starting crash report submission daemon
      *checking battery state...
    That's it. I can't make heads or tails of it. Please note that while I've been running Linux for about a year, I'm still fairly new to all of this, so try to be detailed in your explanations and/or descriptions of what I need to do. Any/all help would be appreciated. Thank you for your time.
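    A hedged guess, based only on the log stopping right after LightDM starts: Radeon APUs that drop to a console like this can often be coaxed to a desktop by booting with the nomodeset kernel parameter, which skips the problematic kernel mode-setting path. A sketch:
      # at the GRUB (or live-USB boot) menu press 'e', add nomodeset to the end of
      # the line that starts with "linux", then press F10 to boot.

      # if that gets you to a desktop, make it permanent after installing:
      sudo sed -i 's/\(GRUB_CMDLINE_LINUX_DEFAULT="[^"]*\)"/\1 nomodeset"/' /etc/default/grub
      sudo update-grub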

    Read the article

  • Screen Resolution stuck at 640x480 after installing Bumblebee

    - by Saurabh Agarwal
    I have a Dell XPS 15z laptop. As you can see here, there are some issues with NVIDIA drivers. The site recommends installing Bumblebee (instructions given in the link). I am posting them again for ease:
      $ sudo add-apt-repository ppa:bumblebee/stable
      $ sudo apt-get update && sudo apt-get upgrade
      $ sudo apt-get install bumblebee bumblebee-nvidia
      $ sudo usermod -a -G bumblebee $USER
    After restarting the computer, however, the screen resolution was stuck at 640x480 and I got the following error message as soon as I logged in:
      Could not apply the stored configuration for monitors
      none of the selected modes were compatible with the possible modes:
      Trying modes for CRTC 63
      CRTC 63: trying mode 640x480@60Hz with output at 1366x768@60Hz (pass 0)
      CRTC 63: trying mode 640x480@60Hz with output at 1366x768@60Hz (pass 1)
      Trying modes for CRTC 64
      CRTC 64: trying mode 640x480@60Hz with output at 1366x768@60Hz (pass 0)
      CRTC 64: trying mode 640x480@60Hz with output at 1366x768@60Hz (pass 1)
    Prior to the update the display was absolutely normal, so there is no doubt about the cause - albeit there was no support for the graphics drivers then. In case it helps, some features of the graphics drivers seem to be functional after Bumblebee; i.e., everything is in order except for the resolution. And if the resolution can't be fixed, please suggest a way to revert the changes so that at least the prior state can be restored. Any help in the matter would be highly appreciated.
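    While the driver side is being sorted out, the desktop can usually be talked back into its native resolution by adding the panel's mode to X by hand with cvt and xrandr. A sketch - the output name (LVDS1 here) is an assumption, so check xrandr -q first, and note cvt may round 1366 up to 1368, which is fine:
      xrandr -q                                   # find the connected output's name, e.g. LVDS1
      MODELINE=$(cvt 1366 768 60 | sed -n 's/^Modeline //p' | tr -d '"')
      MODE=$(echo "$MODELINE" | awk '{print $1}')
      xrandr --newmode $MODELINE                  # left unquoted on purpose so the timings split into arguments
      xrandr --addmode LVDS1 "$MODE"
      xrandr --output LVDS1 --mode "$MODE"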

    Read the article

  • httpd.conf for case-insensitive file serving

    - by Anton Gogolev
    I'm a complete newbie with regard to managing Apache, so excuse me if I'm phrasing something incorrectly. I have a web site - say, http://domain.com. The problem is that when I try to open http://domain.com/index.html in a web browser it displays the page, but when I attempt to access http://domain.com/Index.html (note the capital I), it responds with HTTP 404. How do I configure Apache to serve both these files (and directories, for that matter) in a case-insensitive manner? Current httpd.conf is here.
    EDIT: Dan C, thanks for the hint. I basically want to allow users to download files from my server and don't really want them to be aware that Index.html and index.html are in fact different. I'd also like to know what the ramifications of this decision are.
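    One hedged sketch of the usual answer: Apache's mod_speling module, which corrects a wrongly-cased path component per request (the a2enmod/service commands assume a Debian/Ubuntu layout; on a single-httpd.conf system, load the module and add the directive there instead):
      sudo a2enmod speling
      # then, inside the relevant <Directory> or <VirtualHost> block:
      #     CheckSpelling On
      sudo service apache2 restart
    As for ramifications: mod_speling answers with a redirect to the correctly-cased URL rather than silently serving the file, and it only fixes one wrong component at a time, so keeping filenames consistently lower-case remains the more robust habit.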

    Read the article
