Search Results

Search found 26774 results on 1071 pages for 'distributed development'.


  • GAE, Python 2.5, Python 2.6 Side-by-side on windows

    - by Software Enthusiastic
    Hi. On my development system I have Python 2.6, Django 1.1 and GAE. I have three projects running on Python 2.6 and Django 1.1, and one project using GAE, Python 2.6 and Django 1.1. I have heard that my set-up running GAE on Python 2.6 may cause some head-scratching problems when deploying to the production server, because GAE supports only Python 2.5 and using 2.6 is not recommended. Can I develop a GAE application using Python 2.6? If not, what is the solution? I am using Windows Vista as my development system.
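
    A cheap guard against the 2.5/2.6 mismatch is to assert the interpreter version when the app starts, so the problem surfaces on the development box rather than after deployment. A minimal sketch (the enforced version tuple is an assumption; set it to whatever the production runtime actually uses):

        import sys

        # Production GAE ran Python 2.5 at the time this question was asked; fail
        # fast if the local interpreter differs so the mismatch is caught early.
        EXPECTED = (2, 5)  # assumption: match this to the production runtime

        if sys.version_info[:2] != EXPECTED:
            raise RuntimeError("App targets Python %d.%d but is running on %d.%d"
                               % (EXPECTED + sys.version_info[:2]))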

    Read the article

  • Web Deployment Projects for VS2010 on build server failing with Error MSB4086

    - by SteveBering
    When I upgraded my Web Deployment Project from VS2008 to the VS2010 beta version, I was able to execute the build locally on my development box. However, when I tried to execute the build on our TeamCity build server, I began getting the following exception: C:\Program Files\MSBuild\Microsoft\WebDeployment\v10.0\Microsoft.WebDeployment.targets(162, 37): error MSB4086: A numeric comparison was attempted on "$(_SourceWebProjectPath.Length)" that evaluates to "" instead of a number, in condition "'$(_SourceWebProjectPath)' != '' And $(_SourceWebProjectPath.Length) >= 4)". I did install the Web Deployment Project addin on my build server and I did copy over the C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications directory on my development box to the C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\ directory on the build server. Note: My dev box is 64bit and the build server 32bit. I can't figure out why this is behaving differently on the build server than on my dev machine. Anyone have any ideas? Thanks, Steve

    Read the article

  • Preventing LDAP injection

    - by Matias
    I am working on my first desktop app that queries LDAP. I'm working in C under unix using OpenDS, and I'm new to LDAP. After working on it for a while I noticed that a user would be able to alter the LDAP query by injecting malicious input. I'd like to know which sanitizing techniques are known, not only for C/unix development but in more general terms, i.e. web development etc. I thought that escaping equals signs and semicolons would be enough, but I'm not sure. Here is a little piece of code to make the question clearer: String ldapSearchQuery = "(cn=" + $userName + ")"; System.out.println(ldapSearchQuery); Obviously I need to sanitize $userName, as stated in this OWASP ARTICLE
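
    Escaping equals signs and semicolons is not the right character set; the usual technique is to escape the characters that RFC 4515 treats as special inside a search filter (NUL, backslash, parentheses and the wildcard). A minimal sketch in Python - the function name is illustrative, not from any particular library - of the same idea you would apply in C or in web code:

        # Replace LDAP search-filter metacharacters with their \XX hex escapes
        # (RFC 4515): NUL, backslash, '(', ')' and '*'.
        _FILTER_ESCAPES = {
            '\0': r'\00',
            '\\': r'\5c',
            '(':  r'\28',
            ')':  r'\29',
            '*':  r'\2a',
        }

        def escape_ldap_filter_value(value):
            """Return value with filter metacharacters escaped, char by char."""
            return ''.join(_FILTER_ESCAPES.get(ch, ch) for ch in value)

        user_name = 'foo)(uid=*'                                # hostile input
        print('(cn=%s)' % escape_ldap_filter_value(user_name))  # (cn=foo\29\28uid=\2a)

    Values that end up in a distinguished name follow a different escaping rule set (RFC 4514), so the routine has to match where the user input lands.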

    Read the article

  • Switching VS2010 to use Windows 7.1 SDK

    - by freefallr
    I've used VS2008 on my development machine for some years now, with Windows SDK v7.1. I've installed VS2010, and it's using the Windows SDK v7.0a, but I need it to use the Windows 7.1 SDK (which I had installed prior to installing VS2010). When I run the Windows SDK 7.1 configuration tool to switch the Windows SDK in use, the tool updates VS2008, but not VS2010. The message it reports is: "The Windows SDK Configuration Tool has successfully set Windows SDK version v7.1 as the current version for Visual Studio 2008". The configuration tool is installed with the Windows 7.1 SDK and is found here: "C:\Program Files\Microsoft SDKs\Windows\v7.1\Setup\WindowsSdkVer.exe". VS2010 continues to use WSDK 7.0a, which is extremely frustrating, as I need to do DirectShow development (so I need to build the baseclasses, which aren't released with the 7.0a release of the WSDK). Would I be correct in assuming that it's not updating the VS2010 settings because VS2010 wasn't installed at the time that I installed the Windows 7.1 SDK? Can I fix this manually, or should I uninstall the Windows 7.1 SDK and then reinstall it? Any other suggestions / workarounds for this?

    Read the article

  • Kohana3: Different .htaccess rewritebase and kohana base_url for dev and production environment

    - by Svish
    In my bootstrap.php I have the following: if($_SERVER['SERVER_NAME'] == 'localhost') Kohana::$environment = 'development'; else Kohana::$environment = 'production'; ... switch(Kohana::$environment) { case 'development': $settings = array('base_url' => '/kohana/', 'index_file' => FALSE); break; default: $settings = array('base_url' => '/', 'index_file' => FALSE); break; } In .htaccess I have this: # Installation directory RewriteBase /kohana/ This means that if I just upload my kohana application, it will break, because the RewriteBase in the .htaccess file will be wrong. Is there a way I can have a conditional in the .htaccess file, similar to the one I have in the bootstrap, so that it will use the correct RewriteBase?

    Read the article

  • svn merge - moved repository to a different server, and now getting 'has different repository root'

    - by HorusKol
    This is kind of similar to http://stackoverflow.com/questions/1601021/subversion-merge-has-different-repository-root-than - but it appears to have a very different cause (especially as the answer to that question didn't resolve my problem). A while back we swapped out the server where our SVN repositories are located, but we've been using an alias so that the old server name points to the new server. I've been getting in the habit of using the new server name wherever I check out new working copies - but we haven't made changes to most of the current working copies, as they are live websites. Until now this hasn't been a problem - except that this morning I merged some changes from my development branch into a working copy I have of the release version, and I got the message "file has different repository root" and the merge stopped dead. I know this is because I'm using the new server name while the development branch was updated via the old server name - but is there a simple way to fix this? Or if not a simple way - is there a well-documented way to fix this?

    Read the article

  • Blend 4 breaks VS2010 for Silverlight

    - by Adrian
    Hi, I had VS2010 running fine with Silverlight development. Then I installed Expression Blend 4. Now when I run VS2010 and try to debug a silverlight app I get an error saying "Unable to start debugging. The silverlight developer runtime is not installed. Please install a matching version." I've tried uninstalling silverlight tools, and reinstalling them from scratch (the latest april version). But I still get the same message. So basically I'm now unable to do VS2010 SL development. I'm on the verge of just rolling back to my last system restore point and giving up on Blend. But if I do that I'd be worried that Product Activation would never allow me to reinstall it in the future, since the MSDN download page implies I'm only ever allowed to install it on a single machine. Any help appreciated. Thanks

    Read the article

  • Connect rails application to SQL Server 2005 from Windows

    - by Enrico Carlesso
    Hi guys. I (sadly) have to deploy a rails application on Windows XP which has to connect to Microsoft SQL Server 2005. Searching the web there are a lot of hits for connecting from Linux to SQL Server, but I cannot find out how to do it from Windows. Basically I followed these steps: install the dbi gem, install the activerecord-sql-server-adapter gem. My database.yml now looks like this: development: adapter: sqlserver mode: odbc dsn: test_dj host: HOSTNAME\SQLEXPRESS database: test_dj username: guest password: guest But I'm unable to connect. When I run rake db:migrate I get IM002 (0) [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified I'm not a Windows user, so I don't really understand the meaning of the dsn element. Does anyone have an idea how to solve this? Thanks in advance. With Alexander's tips I have now modified my database.yml to: development: adapter: sqlserver mode: odbc dsn: Provider=SQLOLEDB;Data Source=SCADA\SQLEXPRESS;UID=guest;PWD=guest;Initial Catalog=test_dj;Application Name=test But now rake db:migrate returns: S1090 (0) [Microsoft][ODBC Driver Manager] Invalid string or buffer length Am I missing something?

    Read the article

  • Taking the training wheels off: Accelerating the Business with Oracle IAM by Brian Mozinski (Accenture)

    - by Greg Jensen
    Today, technical requirements for IAM are evolving rapidly, and the bar is continuously raised for high performance IAM solutions as organizations look to roll out high volume use cases on the back of legacy systems. Existing solutions were often designed and architected to support offline transactions and manual processes, and business owners today demand globally scalable infrastructure to support the growth their business cases are expected to deliver. To help IAM practitioners address these challenges and make their organizations and themselves more successful, in this series we will outline:

      • Taking the training wheels off: Accelerating the Business with Oracle IAM - The explosive growth in expectations for IAM infrastructure, and the business cases they support, to gain investment in new security programs.
      • "Necessity is the mother of invention": Technical solutions developed in the field - Well-proven tricks of the trade, used by IAM gurus to maximize your solution while addressing the requirements of global organizations.
      • The Art & Science of Performance Tuning of Oracle IAM 11gR2 - Real world examples of performance tuning with Oracle IAM.
      • No where to go but up: Extending the benefits of accelerated IAM - Anything is possible; compelling new solutions organizations are unlocking with accelerated Oracle IAM.

    Let's get started … by talking about the changing dynamics driving these discussions. Big companies are getting bigger every day, and increasingly organizations operate across state lines, multiple time zones, and in many countries or continents at the same time. No longer is midnight to 6am a safe time to take down the system for upgrades, to run recons, or to import or update user accounts and attributes. Further, IT organizations operate as shared services with SLAs similar to the levels expected of telephone carriers by their "clients". Workers are moved in and out of roles on a weekly, daily, or even hourly basis, and IAM is expected to support those rapid changes. End users registering for services during business hours in Singapore expect their access to be green-lighted in custom apps hosted in Portugal within the hour. Many of the expectations of asynchronous systems and batched updates are no longer adequate, and the number and types of users keeps growing.

    When organizations acted more like independent teams at functional or geographic levels, it was manageable to have processes that relied on a handful of people who knew how to make things work - who knew how to get you access to the key systems to get your job done. Today everyone is expected to do more with less: the finance administrator previously supporting their local Atlanta sales office might now be asked to help close the books for the Johannesburg team, and the access certification process once completed monthly by Joan on the 3rd floor is now done by a shared pool of resources in Sao Paulo. Fragmented processes that rely on institutional knowledge to get access to systems and get work done quickly break down in these scenarios. Highly robust processes with automated workflows for connected or disconnected systems give organizations the dynamic flexibility to share work across these lines and cut costs or increase productivity. As the IT industry's computing paradigms continue to change with the passing of time, and as mature or proven approaches become clear, it is normal for organizations to adjust accordingly.

    Businesses must manage identity in an increasingly hybrid world in which legacy on-premises IAM infrastructures are extended or replaced to support more and more interconnected and interdependent services for a wider range of users. The old legacy IAM implementation models we had relied on to manage identities no longer apply. End users expect to self-request access to services from their tablet, get supervisor approval over mobile devices and email, and launch the application even if it is hosted on the cloud, or run by a partner, vendor, or service provider. While user expectations are higher, they are also simpler: logging into custom desktop apps to request approvals, or going through email- or paper-based processes for certification, is unacceptable. Users expect security to operate within the paradigm of the application, i.e. to feel like the application they are using.

    Citizen- and customer-facing applications have evolved from everywhere, with custom applications, 3rd party tools, and applications merged in from acquired entities or 3rd party OEMs resold to expand your portfolio of services. These all have their own user stores, authentication models, user lifecycles, session management, etc. Often the designers and developers are no longer accessible and the documentation is limited. Bringing the underlying directories together to scale for growth and improve the user experience is critical for revenue, but also for operations.

    Job functions are more dynamic. Take the Olympics for example: endless organizations, from corporations broadcasting, endorsing, or marketing through the event, to non-profit athletic foundations and public/government entities for athletes and public safety, all operate simultaneously on the world stage. Each organization needs to spin up short-term teams, often dealing with proprietary information from hot ads to racing strategies or security plans. IAM is expected to enable teams to spin up, enable new applications, protect privacy, and secure critical infrastructure - and then access needs to be disabled just as quickly as users go back to their previous responsibilities.

    On a more technical level, businesses today need an optimized system directory along with tuning guidelines and parameters. They need to make the right choices (for example, virtual directories) and assess and choose the correct architectural patterns - virtual, direct, or replicated; centralized, virtualized, or distributed - along with the appropriate tuning. Today's business organizations are very complex heterogeneous enterprises that contain diverse and multifaceted information. In today's ever-changing global landscape, the strategic end goal in challenging times is business agility. The business of identity management requires enterprises to be more agile and more responsive than ever before. The continued proliferation of networked devices (PCs, tablets, PDAs, notebooks, etc.) has caused the number of devices, and the number of users granted access to them, to grow exponentially. Businesses need to deploy an IAM system that can account for the demands for authentication and authorization across these devices. Increased innovation is forcing businesses and organizations to centralize their identity management services.

    Access management needs to handle traditional web-based access as well as new innovations around mobile, and to address insufficient governance processes which can lead to rogue identity accounts, which in turn become a source of vulnerabilities within a business's identity platform. Risk-based decisions present their own challenges: an adaptive risk model must make proper access decisions via standard web single sign-on for internal and external customers. Organizations have to move beyond simple logins and passwords to address trusted-relationship questions such as: Is this a trusted customer, client, or citizen? Is this a trusted employee, vendor, or partner? Is this a trusted device? Without a solid technological foundation, organizational performance, collaboration, constituent services, and other organizational processes will languish.

    A single server location presents not only network concerns for a distributed user base, but identity challenges as well. The network risks center on the latency of the long trip the traffic has to take; the other risks concern availability - if the single identity server is lost, all access is lost. As you can see, there are many reasons why performance tuning IAM will have a substantial impact on the success of your organization. In our next installment in the series we roll up our sleeves and get into detailed tuning techniques used every day by thought leaders in the field implementing Oracle Identity & Access Management Solutions.

    Read the article

  • iPhone SDK Push Notification

    - by Craig
    I have setup push notifications in the apple developer panel and added the code to my application. It works fine on the phone using a development profile but if I use a distribution (ad-hoc) profile so that I can give it to a few users for testing it gives an error and crashes, the log gives the following error Code: Thu Jun 25 22:22:35 unknown SpringBoard[729] <Warning>: *** Assertion failure in -[SBRemoteNotificationServer registerApplication:forEnvironment:withTypes:], /SourceCache/SpringBoard/SpringBoard-919.5/SBRemoteNotificationServer.m:633 Thu Jun 25 22:22:35 unknown SpringBoard[729] <Error>: *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'no connection found for environment production' I am using the following code in the app Code: [[UIApplication sharedApplication] registerForRemoteNotificationTypes:(UIRemoteNotificationTypeAlert | UIRemoteNotificationTypeBadge | UIRemoteNotificationTypeSound)]; The thing I don't understand is why it works perfectly using a development profile but with ad-hoc it crashes. Does anyone know what would cause this?, I've tried changing lots of things to try and find the issue but have found nothing.

    Read the article

  • New Feature in ODI 11.1.1.6: ODI for Big Data

    - by Julien Testut
    By Ananth Tirupattur

    Starting with Oracle Data Integrator 11.1.1.6.0, ODI offers a solution for processing Big Data. This post provides an overview of this feature. With all the buzz around Big Data, and before getting into the details of ODI for Big Data, I will give a brief introduction to Big Data and the Oracle solution for Big Data.

    So, what is Big Data? Big Data includes:
      • structured data (data from relational data stores, XML data stores)
      • semi-structured data (data from weblogs)
      • unstructured data (data from text blobs, images)

    Traditionally, business decisions are based on the information gathered from transactional data. For example, transactional data from CRM applications is fed to a decision system for analysis and decision making. Products such as ODI play a key role in enabling decision systems. However, with the emergence of massive amounts of semi-structured and unstructured data, it is important for decision systems to include them in the analysis to achieve better decision making capability. While there is an abundance of opportunities for businesses to gain competitive advantages, processing Big Data has challenges:
      • Volume of data
      • Velocity of data - the high rate at which data is generated
      • Variety of data

    In order to address these challenges and convert them into opportunities, we need an appropriate framework, platform and the right set of tools. Hadoop is an open source framework - a highly scalable, fault-tolerant system for storing and processing large amounts of data. Hadoop provides two key services: distributed and reliable storage called the Hadoop Distributed File System (HDFS), and a framework for parallel data processing called Map-Reduce. Innovations in Hadoop and its related technologies continue to evolve rapidly, so it is highly recommended to follow information on the web to keep up with the latest developments.

    Oracle's vision is to provide a comprehensive solution to address the challenges posed by Big Data, including the necessary hardware, software and tools. The Oracle solution includes:
      • Big Data Appliance
      • Oracle NoSQL Database
      • Cloudera distribution for Hadoop
      • Oracle R Enterprise - R is a statistical package which is very popular among data scientists
      • ODI solution for Big Data
      • Oracle Loader for Hadoop, for loading data from Hadoop to Oracle

    Further details can be found here: http://www.oracle.com/us/products/database/big-data-appliance/overview/index.html

    ODI Solution for Big Data: ODI's goal is to minimize the need to understand the complexity of the Hadoop framework and to simplify the adoption of Big Data processing in an enterprise. ODI provides the capabilities for an integrated architecture for processing Big Data. This includes the capability to load data into Hadoop, process data in Hadoop and load data from Hadoop into Oracle. ODI is expanding its support for Big Data by providing the following out-of-the-box Knowledge Modules (KMs):
      • IKM File to Hive (LOAD DATA) - Load unstructured data from a file (local file system or HDFS) into Hive
      • IKM Hive Control Append - Transform and validate structured data on Hive
      • IKM Hive Transform - Transform unstructured data on Hive
      • IKM File/Hive to Oracle (OLH) - Load processed data in Hive to Oracle
      • RKM Hive - Reverse-engineer Hive tables to generate models

    Using the loading KM you can map files (local and HDFS files) to the corresponding Hive tables. For example, you can map weblog files categorized by date into a corresponding partitioned Hive table schema.

    Using the Hive Control Append KM you can validate and transform data in Hive. In the example below, two source Hive tables are joined and mapped to a target Hive table.

    The Hive Transform KM facilitates processing of semi-structured data in Hive. In the example below, the data from a weblog is processed using a Perl script and mapped to a target Hive table.

    Using the Oracle Loader for Hadoop (OLH) KM you can load data from a Hive table or HDFS to a corresponding table in Oracle. OLH is available as a standalone product. ODI greatly enhances OLH's capability by generating the configuration and mapping files for OLH based on the configuration provided in the interface and KM options. ODI seamlessly invokes OLH when executing the scenario. In the example below, an HDFS file is mapped to a table in Oracle.

    Development and Deployment: The following diagram illustrates the development and deployment of the ODI solution for Big Data. Using ODI Studio on your development machine, create and develop the ODI solution for processing Big Data by connecting to a MySQL DB or Oracle database on a BDA machine or Hadoop cluster. Schedule the ODI scenarios to be executed on the ODI agent deployed on the BDA machine or Hadoop cluster. The ODI solution for Big Data provides several exciting new capabilities to facilitate the adoption of Big Data in an enterprise. You can find more information about the Oracle Big Data connectors on OTN.
You can find an overview of all the new features introduced in ODI 11.1.1.6 in the following document: ODI 11.1.1.6 New Features Overview

    Read the article

  • Install app on Motorola Backflip from AT&T

    - by eric
    I'm trying to test an app out on the Motorola Backflip with AT&T as the carrier. I checked USB debugging on the phone's Development screen. Using Eclipse, how do I get the app to load on the Backflip so I can test it? DDMS shows a device with a bunch of question marks and "unknown". It only seems to give me the option of loading the app onto the SD card, which doesn't do me any good. I searched and found a Motorola driver which I'm supposed to install to the adb folder. Where is that folder? I've checked on the phone and on my development machine. Maybe I need new glasses?

    Read the article

  • NAnt build issues with Mono

    - by calmcajun
    I am trying to build a Mono project using NAnt but I get the error listed below. I have tried altering the environment variable PKG_CONFIG_PATH to include the path leading to the file: mono.pc but that does not seem to work. Failed to initialize the 'Mono 3.5 Profile' (mono-3.5) target framework.: NAnt.Core.BuildException: Failed to initialize the 'Mono 3.5 Profile' (mono-3.5) target framework. ---> Unable to locate 'mono' module using pkg-config. Download the Mono development packages from http://www.mono-project.com/downloads/.: NAnt.Core.BuildException: Unable to locate 'mono' module using pkg-config. Download the Mono development packages from http://www.mono-project.com/downloads/. at NAnt.Core.Tasks.FailTask.ExecuteTask () [0x00000] in <filename unknown>:0 at NAnt.Core.Task.Execute () [0x00000] in <filename unknown>:0 --- End of inner exception stack trace --- at NAnt.Core.FrameworkInfo.Init () [0x00000] in <filename unknown>:0 at NAnt.Core.FrameworkInfo.Validate () [0x00000] in <filename unknown>:0 at NAnt.Core.ProjectSettingsLoader.ConfigureRuntimeFramework () [0x00000] in <filename unknown>:0

    Read the article

  • How can I connect to Android with ADB over TCP?

    - by martinjd
    I am attempting to debug an application on a Motorola Droid but I am having some difficulty connecting to the device via USB. My development server is a Windows 7 64bit VM running in HyperV and so I cannot connect directly via USB in the guest or from the host. I installed a couple of different USB over TCP solutions but the connection appears to have issues since the adb monitor reports "devicemonitor failed to start monitoring" repeatedly. I was wondering if there is a way to connect directly from the client on the development machine to the daemon on the device using the network instead of the usb connection or possibly other viable options?
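
    For reference, adbd can be switched into TCP mode and then attached to over the network; the catch is that the initial "adb tcpip" step itself needs either a working USB connection (for example from the HyperV host) or a rooted shell that sets service.adb.tcp.port. A minimal sketch wrapping the usual commands, assuming adb is on the PATH and the device's Wi-Fi address is known:

        import subprocess

        DEVICE_IP = "192.168.1.50"   # assumption: the Droid's Wi-Fi address
        PORT = 5555

        def adb(*args):
            """Run an adb command and raise if it exits non-zero."""
            subprocess.check_call(["adb"] + list(args))

        # Step 1 - run where the device is visible over USB (or set
        # service.adb.tcp.port from a rooted shell instead): restart adbd on TCP.
        adb("tcpip", str(PORT))

        # Step 2 - run on the development VM: attach over the network.
        adb("connect", "%s:%d" % (DEVICE_IP, PORT))
        adb("devices")   # the device should now be listed as ip:port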

    Read the article

  • Trouble on setting SSL certificates for Virtual Hosts using Apache\Phusion Passenger in localhost

    - by user502052
    I am using Ruby on Rails 3 and I would like to make to work HTTPS connections on localhost. I am using: Apache v2 + Phusion Passenger Mac OS + Snow Leopard v10.6.6 My Ruby on Rails installation use the Typhoeus gem (it is possible to use the Ruby net\http library but the result doesn't change) to make HTTP requests over HTTPS. I created self-signed ca.key, pjtname.crt and pjtname.key as detailed on the Apple website. Notice: Following instruction from the Apple website, on running the openssl req -new -key server.key -out server.csr command (see the link) at this point Common Name (eg, YOUR name) []: (this is the important one) I entered *pjtname.com so that is valid for all sub_domain of that site. In my Apache httpd.conf I have two virtual hosts configured in this way: # Secure (SSL/TLS) connections #Include /private/etc/apache2/extra/httpd-ssl.conf # # Note: The following must must be present to support # starting without SSL on platforms with no /dev/random equivalent # but a statically compiled-in mod_ssl. # <IfModule ssl_module> SSLRandomSeed startup builtin SSLRandomSeed connect builtin </IfModule> Include /private/etc/apache2/other/*.conf # Passenger configuration LoadModule passenger_module /Users/<my_user_name>/.rvm/gems/ruby-1.9.2-p136/gems/passenger-3.0.2/ext/apache2/mod_passenger.so PassengerRoot /Users/<my_user_name>/.rvm/gems/ruby-1.9.2-p136/gems/passenger-3.0.2 PassengerRuby /Users/<my_user_name>/.rvm/wrappers/ruby-1.9.2-p136/ruby # Go ahead and accept connections for these vhosts # from non-SNI clients SSLStrictSNIVHostCheck off # Ensure that Apache listens on port 443 Listen 443 # Listen for virtual host requests on all IP addresses NameVirtualHost *:80 NameVirtualHost *:443 # # PJTNAME.COM and subdomains SETTING # <VirtualHost *:443> # Because this virtual host is defined first, it will # be used as the default if the hostname is not received # in the SSL handshake, e.g. if the browser doesn't support # SNI. 
ServerName pjtname.com:443 DocumentRoot "/Users/<my_user_name>/Sites/pjtname.com/pjtname.com/public" ServerAdmin [email protected] ErrorLog "/private/var/log/apache2/error_log" TransferLog "/private/var/log/apache2/access_log" RackEnv development <Directory "/Users/<my_user_name>/Sites/pjtname.com/pjtname.com/public"> Order allow,deny Allow from all </Directory> # SSL Configuration SSLEngine on # Self Signed certificates # Server Certificate SSLCertificateFile /private/etc/apache2/ssl/wildcard.certificate/pjtname.crt # Server Private Key SSLCertificateKeyFile /private/etc/apache2/ssl/wildcard.certificate/pjtname.key # Server Intermediate Bundle SSLCertificateChainFile /private/etc/apache2/ssl/wildcard.certificate/ca.crt </VirtualHost> # HTTP Setting <VirtualHost *:80> ServerName pjtname.com DocumentRoot "/Users/<my_user_name>/Sites/pjtname.com/pjtname.com/public" RackEnv development <Directory "/Users/<my_user_name>/Sites/pjtname.com/pjtname.com/public"> Order allow,deny Allow from all </Directory> </VirtualHost> <VirtualHost *:443> ServerName users.pjtname.com:443 DocumentRoot "/Users/<my_user_name>/Sites/pjtname.com/users.pjtname.com/public" ServerAdmin [email protected] ErrorLog "/private/var/log/apache2/error_log" TransferLog "/private/var/log/apache2/access_log" RackEnv development <Directory "/Users/<my_user_name>/Sites/pjtname.com/users.pjtname.com/public"> Order allow,deny Allow from all </Directory> # SSL Configuration SSLEngine on # Self Signed certificates # Server Certificate SSLCertificateFile /private/etc/apache2/ssl/wildcard.certificate/pjtname.crt # Server Private Key SSLCertificateKeyFile /private/etc/apache2/ssl/wildcard.certificate/pjtname.key # Server Intermediate Bundle SSLCertificateChainFile /private/etc/apache2/ssl/wildcard.certificate/ca.crt </VirtualHost> # HTTP Setting <VirtualHost *:80> ServerName users.pjtname.com DocumentRoot "/Users/<my_user_name>/Sites/pjtname.com/users.pjtname.com/public" RackEnv development <Directory "/Users/<my_user_name>/Sites/pjtname.com/users.pjtname.com/public"> Order allow,deny Allow from all </Directory> </VirtualHost> In the host file I have: ## # Host Database # # localhost is used to configure the loopback interface # when the system is booting. Do not change this entry. 
## 127.0.0.1 localhost 255.255.255.255 broadcasthost ::1 localhost fe80::1%lo0 localhost # PJTNAME.COM SETTING 127.0.0.1 pjtname.com 127.0.0.1 users.pjtname.com All seems to work properly because I have already set everything (I think correctly): I generated a wildcard certificate for my domains and sub-domains (in this example: *.pjtname.com) I have set base-named virtualhosts in the http.conf file listening on port :433 and :80 My browser accept certificates also if it alerts me that those aren't safe (notice: I must accept certificates for each domain\sub-domain; that is, [only] at the first time I access a domain or sub-domain over HTTPS I must do the same procedure for acceptance) and I can have access to pages using HTTPS After all this work, when I make a request using Typhoeus (I can use also the Ruby Net::Http library and the result doesn't change) from the pjtname.com RoR application: # Typhoeus request Typhoeus::Request.get("https://users.pjtname.com/") I get something like a warning about the certificate: --- &id001 !ruby/object:Typhoeus::Response app_connect_time: 0.0 body: "" code: 0 connect_time: 0.000625 # Here is the warning curl_error_message: Peer certificate cannot be authenticated with known CA certificates curl_return_code: 60 effective_url: https://users.pjtname.com/ headers: "" http_version: mock: false name_lookup_time: 0.000513 pretransfer_time: 0.0 request: !ruby/object:Typhoeus::Request after_complete: auth_method: body: ... All this means that something is wrong. So, what I have to do to avoid the "Peer certificate cannot be authenticated with known CA certificates" warning and make the HTTPS request to work? Where is\are the error\errors (I think in the Apache configuration, but where?!)? P.S.: if you need some more info, let me know.

    Read the article

  • Apache fop-0.95 error on FopFactory.newInstance() command

    - by FlexInfoSys
    I am using Apache fop-0.95 to build pdf files from a JSP web application on IBM iSeries V5R4 using Websphere 6.0. Everything works perfect in my development using Websphere Development Studio client. When I put the application on the server, I get an error at this line. FopFactory fopFactory = FopFactory.newInstance(); The error is: java.lang.UnsatisfiedLinkError: javax/imageio/ImageIO Does anyone know how I can fix this error? All of the fop class files are part of the EAR file. The files were installed to the projects \WEB-INF\lib directory. I have added the fop jar files to the classpath, using the admin console. I am running IBM WebSphere Application Server - Express, 6.0.2.9 Build Number: cf90614.22 on IBM iSeries V5R4

    Read the article

  • Error on windows using session from appengine-utilities

    - by fredrik
    Hi, I ran across an odd problem while trying to transfer a project to a Windows machine. In my project I use a session handler (http://gaeutilities.appspot.com/session); it works fine on my Mac, but on Windows I get:
    Traceback (most recent call last):
      File "C:\Program Files (x86)\Google\google_appengine\google\appengine\ext\webapp\__init__.py", line 510, in __call__
        handler.get(*groups)
      File "C:\Development\Byggmax.Affiliate\bmaffiliate\admin.py", line 29, in get
        session = Session()
      File "C:\Development\Byggmax.Affiliate\bmaffiliate\appengine_utilities\sessions.py", line 547, in __init__
        self.cookie.load(string_cookie)
      File "C:\Python26\lib\Cookie.py", line 628, in load
        for k, v in rawdata.items():
    AttributeError: 'unicode' object has no attribute 'items'
    Is anyone familiar with the session handler who knows anything about this? All help is welcome! ..fredrik
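
    The traceback points at Cookie.BaseCookie.load(): the stdlib only takes the string-parsing path for a plain str, and a unicode value falls through to the dict branch and dies on .items(). A minimal workaround, assuming you are willing to patch the spot in appengine_utilities/sessions.py shown in the traceback, is to encode the cookie header before loading it:

        # appengine_utilities/sessions.py, just before the failing line:
        # BaseCookie.load() only parses plain str; give it bytes, not unicode.
        if isinstance(string_cookie, unicode):
            string_cookie = string_cookie.encode('utf-8')
        self.cookie.load(string_cookie)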

    Read the article

  • Running your own GAE server

    - by h2g2java
    The question http://stackoverflow.com/questions/2505265/how-difficult-is-it-to-migrate-away-from-google-app-engine triggered me to think about this issue again. I have read of someone running the development version of Google App Engine, production-wise, on their own server. My questions are: Are there any security issues in running GAE development on your own server in production mode and exposing it to the www? If so, how do I mitigate them? Can the GAE dev server be run on Amazon? Is it possible to port my GAE apps running on Google servers to a GAE running on Amazon, without code changes and without changing any references for using other gdata services such as Google Docs, YouTube, Gmail, etc.? How do I configure the GAE dev server to use my own hadoop? Or to use Amazon's hadoop?

    Read the article

  • What technologies should I focus on to work as a developer in Japan?

    - by Atomiton
    I'm thinking of one day moving to Japan and I was wondering if anyone here has any experience working there. I'm curious as to what languages/technology are popular there for web development and software development. I have heard Ruby is/was strong there due to its founder being Japanese. What would you recommend someone focus on if they wanted to work as a developer in Japan? I have heard Microsoft has a strong base in Japan, but my guess is that whatever platform has supported unicode or Shift-JIS the best would be the strongest.

    Read the article

  • Can somebody explain this Objective C method declaration syntax

    - by Doug R
    I'm working through an iPhone development book* without really knowing Objective C. For the most part I'm able to follow what's going on, but there are a few method declarations like the one below that I'm having a bit of trouble parsing. For example: - (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger) section { return [self.controllers count]; //controllers is an instance variable of type NSArray in this class } It looks like this is a method called numberOfRowsInSection, and it returns an NSInteger and takes an NSInteger as a parameter which is locally called 'section'. But I don't understand all the references to tableView, or why this takes a parameter when it is not used within the method. Can somebody clarify this? Thanks. *p. 258, Beginning iPhone 3 Development, by Mark and LaMarche, published by Apress

    Read the article

  • ASP.NET MVC 2 - Account controller not found

    - by Chris
    Hi all, I've recently created an ASP.NET MVC 2 application, which works perfectly in the development environment. However, when I deploy it to the server (123-reg Premium Hosting), I can access all of the expected areas - except the Account controller (www.host.info/Account). This then attempts to redirect to the Error.aspx page (www.host.info/Shared/Error.aspx) which it cannot find. I've checked that all of the views have been published, and they're all in the correct place. It seems bizarre that two other controllers can be accessed with no problems, whereas the Account controller cannot be found. I have since renamed the AccountController to SecureController, and all of the dependencies, to no avail. The problem with not being able to find the Error.aspx page also occurs on the development environment. Any ideas would be greatly appreciated. Thanks, Chris

    Read the article

  • dynamic log4net appender name?

    - by sanjeev40084
    Let's say I have 3 SMTP appenders in the same log4net file, whose names are: <appender name = "emailDevelopment".. /> <appender name = "emailBeta".. /> <appender name = "emailProduction".. /> Let's say I have 3 different servers (Dev, Beta, Production). Depending upon the server, I want to fire the log. In the case of the Development server, it would fire the log from "emailDevelopment". I have a system variable on each server named "ApplicationEnvironment" whose value is Development, Beta, or Production, based on the server name. Now is there any way I can set up root in log4net so that it fires email depending upon the server name? <root> <priority value="ALL" /> <appender-ref ref="email<environment name from whose appender should be used>" /> </root>

    Read the article

  • iphone/ipad orientation handling

    - by Mark
    This is more of a general question for people to provide me guidance on. Basically I'm learning iPad/iPhone development and have finally come across the multi-orientation support question. I have looked up a fair amount of documentation, and my book "Beginning iPhone 3 Development" has a nice chapter on it. But my question is this: if I were to programmatically change my controls (or even use different views for each orientation), how on earth do people maintain their code base? I can just imagine so many issues with spaghetti code/thousands of "if" checks all over the place, that it would drive me nuts to make one small change to the UI arrangement. Does anyone have experience handling this issue? What is a nice way to control it? Thanks a lot Mark

    Read the article

  • Passenger apache default page error

    - by Ganesh Shankar
    Sorry if this is the wrong place to ask this question. I asked it a couple of days ago on Server Fault but am getting no love. (It is sort of related to rails development...) The Question I just installed Passenger and the Passenger Pref Pane on OSX. However, when I try to browse to one of my Rails applications I just get the default Apache "it works!" page. I've checked the vhost definitions and they seem ok so I can't seem to figure out whats wrong... I've tried reinstalling passenger and the pref pane and restarting apache but to no avail. Anyone know how to fix this? My vhost definition looks like this: <VirtualHost *:80> ServerName boilinghot.local DocumentRoot "/Users/ganesh/Code/boilinghot/public" RailsEnv development <Directory "/Users/ganesh/Code/boilinghot/public"> Order allow,deny Allow from all </Directory> </VirtualHost>

    Read the article

  • What's the best BDD framework for working with ASP.NET MVC 2 + C# 4?

    - by Soul_Master
    I just heard about BDD when I watched a video of Scott Guthrie in Sweden. One of the listeners asked Scott what VS2010 and ASP.NET MVC do to support BDD. After that, I searched for information about BDD (Behavior Driven Development), which focuses on specifications more than on unit testing when compared with TDD (Test Driven Development). I found some frameworks that work with Ruby and Java, but I do not know of any famous framework for .NET. Please suggest a BDD framework and summarize its pros/cons. PS. The suggested BDD framework must work great on .NET 4, C# 4.0 and ASP.NET MVC 2. Thanks,

    Read the article
