Search Results


  • Azure Mobile Services: what files does it consist of?

    - by svdoever
    Azure Mobile Services is a platform that provides a small set of functionality, consisting of authentication, custom data tables, custom APIs, scheduling scripts and push notifications, to be used as the back-end of a mobile application or, if you want, any application or web site. As described in my previous post, Azure Mobile Services: lessons learned, the documentation on what can be used in the custom scripts is a bit minimalistic. The list below of all files the complete Azure Mobile Services platform consists of can shed some light on what is available in the platform. In following posts I will provide more detailed information on what we can conclude from this list of files. Below are the available files as found in the Azure Mobile Services platform. The bold files describe your data model, API scripts, scheduler scripts and table scripts; those are the files you configure/construct to provide the “configuration”/implementation of your mobile service (a sketch of what such a table script can look like follows after the list). The files are located in a folder like C:\DWASFiles\Sites\youreservice\VirtualDirectory0\site\wwwroot. One file is missing from the list below: the event log file C:\DWASFiles\Sites\youreservice\VirtualDirectory0\site\LogFiles\eventlog.xml, where messages written with, for example, console.log(), and exceptions caught by the system are written. NOTA BENE: the Azure Mobile Services system is under full development; new releases may change the list of files.

    ./app.js ./App_Data/config/datamodel.json ./App_Data/config/scripts/api/youreapi.js ./App_Data/config/scripts/api/youreapi.json ./App_Data/config/scripts/scheduler/placeholder ./App_Data/config/scripts/scheduler/youresheduler.js ./App_Data/config/scripts/shared/placeholder ./App_Data/config/scripts/table/placeholder ./App_Data/config/scripts/table/yourtable.insert.js ./App_Data/config/scripts/table/yourtable.update.js ./App_Data/config/scripts/table/yourtable.delete.js ./App_Data/config/scripts/table/yourtable.read.js ./node_modules/apn/index.js ./node_modules/apn/lib/connection.js ./node_modules/apn/lib/device.js ./node_modules/apn/lib/errors.js ./node_modules/apn/lib/feedback.js ./node_modules/apn/lib/notification.js ./node_modules/apn/lib/util.js ./node_modules/apn/node_modules/q/package.json ./node_modules/apn/node_modules/q/q.js ./node_modules/apn/package.json ./node_modules/azure/lib/azure.js ./node_modules/azure/lib/cli/blobUtils.js ./node_modules/azure/lib/cli/cacheUtils.js ./node_modules/azure/lib/cli/callbackAggregator.js ./node_modules/azure/lib/cli/cert.js ./node_modules/azure/lib/cli/channel.js ./node_modules/azure/lib/cli/cli.js ./node_modules/azure/lib/cli/commands/account.js ./node_modules/azure/lib/cli/commands/config.js ./node_modules/azure/lib/cli/commands/deployment.js ./node_modules/azure/lib/cli/commands/deployment_.js ./node_modules/azure/lib/cli/commands/help.js ./node_modules/azure/lib/cli/commands/log.js ./node_modules/azure/lib/cli/commands/log_.js ./node_modules/azure/lib/cli/commands/repository.js ./node_modules/azure/lib/cli/commands/repository_.js ./node_modules/azure/lib/cli/commands/service.js ./node_modules/azure/lib/cli/commands/site.js ./node_modules/azure/lib/cli/commands/site_.js ./node_modules/azure/lib/cli/commands/vm.js ./node_modules/azure/lib/cli/common.js ./node_modules/azure/lib/cli/constants.js ./node_modules/azure/lib/cli/generate-psm1-utils.js ./node_modules/azure/lib/cli/generate-psm1.js ./node_modules/azure/lib/cli/iaas/blobserviceex.js ./node_modules/azure/lib/cli/iaas/deleteImage.js 
./node_modules/azure/lib/cli/iaas/image.js ./node_modules/azure/lib/cli/iaas/upload/blobInfo.js ./node_modules/azure/lib/cli/iaas/upload/bufferStream.js ./node_modules/azure/lib/cli/iaas/upload/intSet.js ./node_modules/azure/lib/cli/iaas/upload/jobTracker.js ./node_modules/azure/lib/cli/iaas/upload/pageBlob.js ./node_modules/azure/lib/cli/iaas/upload/streamMerger.js ./node_modules/azure/lib/cli/iaas/upload/uploadVMImage.js ./node_modules/azure/lib/cli/iaas/upload/vhdTools.js ./node_modules/azure/lib/cli/keyFiles.js ./node_modules/azure/lib/cli/patch-winston.js ./node_modules/azure/lib/cli/templates/node/iisnode.yml ./node_modules/azure/lib/cli/utils.js ./node_modules/azure/lib/diagnostics/logger.js ./node_modules/azure/lib/http/webresource.js ./node_modules/azure/lib/serviceruntime/fileinputchannel.js ./node_modules/azure/lib/serviceruntime/goalstatedeserializer.js ./node_modules/azure/lib/serviceruntime/namedpipeinputchannel.js ./node_modules/azure/lib/serviceruntime/namedpipeoutputchannel.js ./node_modules/azure/lib/serviceruntime/protocol1runtimeclient.js ./node_modules/azure/lib/serviceruntime/protocol1runtimecurrentstateclient.js ./node_modules/azure/lib/serviceruntime/protocol1runtimegoalstateclient.js ./node_modules/azure/lib/serviceruntime/roleenvironment.js ./node_modules/azure/lib/serviceruntime/runtimekernel.js ./node_modules/azure/lib/serviceruntime/runtimeversionmanager.js ./node_modules/azure/lib/serviceruntime/runtimeversionprotocolclient.js ./node_modules/azure/lib/serviceruntime/xmlcurrentstateserializer.js ./node_modules/azure/lib/serviceruntime/xmlgoalstatedeserializer.js ./node_modules/azure/lib/serviceruntime/xmlroleenvironmentdatadeserializer.js ./node_modules/azure/lib/services/blob/blobservice.js ./node_modules/azure/lib/services/blob/hmacsha256sign.js ./node_modules/azure/lib/services/blob/models/blobresult.js ./node_modules/azure/lib/services/blob/models/blocklistresult.js ./node_modules/azure/lib/services/blob/models/containeraclresult.js ./node_modules/azure/lib/services/blob/models/containerresult.js ./node_modules/azure/lib/services/blob/models/leaseresult.js ./node_modules/azure/lib/services/blob/models/listblobsresultcontinuation.js ./node_modules/azure/lib/services/blob/models/listcontainersresultcontinuation.js ./node_modules/azure/lib/services/blob/models/servicepropertiesresult.js ./node_modules/azure/lib/services/blob/sharedaccesssignature.js ./node_modules/azure/lib/services/blob/sharedkey.js ./node_modules/azure/lib/services/blob/sharedkeylite.js ./node_modules/azure/lib/services/core/connectionstringparser.js ./node_modules/azure/lib/services/core/exponentialretrypolicyfilter.js ./node_modules/azure/lib/services/core/linearretrypolicyfilter.js ./node_modules/azure/lib/services/core/servicebusserviceclient.js ./node_modules/azure/lib/services/core/servicebussettings.js ./node_modules/azure/lib/services/core/serviceclient.js ./node_modules/azure/lib/services/core/servicemanagementclient.js ./node_modules/azure/lib/services/core/servicemanagementsettings.js ./node_modules/azure/lib/services/core/servicesettings.js ./node_modules/azure/lib/services/core/storageserviceclient.js ./node_modules/azure/lib/services/core/storageservicesettings.js ./node_modules/azure/lib/services/queue/models/listqueuesresultcontinuation.js ./node_modules/azure/lib/services/queue/models/queuemessageresult.js ./node_modules/azure/lib/services/queue/models/queueresult.js ./node_modules/azure/lib/services/queue/models/servicepropertiesresult.js 
./node_modules/azure/lib/services/queue/queueservice.js ./node_modules/azure/lib/services/serviceBus/models/acstokenresult.js ./node_modules/azure/lib/services/serviceBus/models/queuemessageresult.js ./node_modules/azure/lib/services/serviceBus/models/queueresult.js ./node_modules/azure/lib/services/serviceBus/models/ruleresult.js ./node_modules/azure/lib/services/serviceBus/models/subscriptionresult.js ./node_modules/azure/lib/services/serviceBus/models/topicresult.js ./node_modules/azure/lib/services/serviceBus/servicebusservice.js ./node_modules/azure/lib/services/serviceBus/wrap.js ./node_modules/azure/lib/services/serviceBus/wrapservice.js ./node_modules/azure/lib/services/serviceBus/wraptokenmanager.js ./node_modules/azure/lib/services/serviceManagement/models/roleparser.js ./node_modules/azure/lib/services/serviceManagement/models/roleschema.json ./node_modules/azure/lib/services/serviceManagement/models/servicemanagementserialize.js ./node_modules/azure/lib/services/serviceManagement/servicemanagementservice.js ./node_modules/azure/lib/services/table/batchserviceclient.js ./node_modules/azure/lib/services/table/models/entityresult.js ./node_modules/azure/lib/services/table/models/queryentitiesresultcontinuation.js ./node_modules/azure/lib/services/table/models/querytablesresultcontinuation.js ./node_modules/azure/lib/services/table/models/servicepropertiesresult.js ./node_modules/azure/lib/services/table/models/tableresult.js ./node_modules/azure/lib/services/table/sharedkeylitetable.js ./node_modules/azure/lib/services/table/sharedkeytable.js ./node_modules/azure/lib/services/table/tablequery.js ./node_modules/azure/lib/services/table/tableservice.js ./node_modules/azure/lib/util/atomhandler.js ./node_modules/azure/lib/util/certificates/der.js ./node_modules/azure/lib/util/certificates/pkcs.js ./node_modules/azure/lib/util/constants.js ./node_modules/azure/lib/util/iso8061date.js ./node_modules/azure/lib/util/js2xml.js ./node_modules/azure/lib/util/rfc1123date.js ./node_modules/azure/lib/util/util.js ./node_modules/azure/lib/util/validate.js ./node_modules/azure/LICENSE.txt ./node_modules/azure/node_modules/async/index.js ./node_modules/azure/node_modules/async/lib/async.js ./node_modules/azure/node_modules/async/LICENSE ./node_modules/azure/node_modules/async/package.json ./node_modules/azure/node_modules/azure/lib/azure.js ./node_modules/azure/node_modules/azure/lib/diagnostics/logger.js ./node_modules/azure/node_modules/azure/lib/http/webresource.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/fileinputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/goalstatedeserializer.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/namedpipeinputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/namedpipeoutputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimeclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimecurrentstateclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimegoalstateclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/roleenvironment.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimekernel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimeversionmanager.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimeversionprotocolclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlcurrentstateserializer.js 
./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlgoalstatedeserializer.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlroleenvironmentdatadeserializer.js ./node_modules/azure/node_modules/azure/lib/services/blob/blobservice.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedaccesssignature.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedkey.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedkeylite.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/blobresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/blocklistresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/containeraclresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/containerresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/leaseresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/listblobsresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/listcontainersresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/core/connectionstringparser.js ./node_modules/azure/node_modules/azure/lib/services/core/exponentialretrypolicyfilter.js ./node_modules/azure/node_modules/azure/lib/services/core/hmacsha256sign.js ./node_modules/azure/node_modules/azure/lib/services/core/linearretrypolicyfilter.js ./node_modules/azure/node_modules/azure/lib/services/core/servicebusserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicebussettings.js ./node_modules/azure/node_modules/azure/lib/services/core/serviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicemanagementclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicemanagementsettings.js ./node_modules/azure/node_modules/azure/lib/services/core/servicesettings.js ./node_modules/azure/node_modules/azure/lib/services/core/sqlserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/storageserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/storageservicesettings.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/listqueuesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/queuemessageresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/queueresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/queueservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/apnsservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/gcmservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/sharedaccesssignature.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/wrap.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/wraptokenmanager.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/acstokenresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/notificationhubresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/queuemessageresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/queueresult.js 
./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/registrationresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/resourceresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/ruleresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/subscriptionresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/topicresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/notificationhubservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/servicebusservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/servicebusservicebase.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/wnsservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/wrapservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/hdinsightservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/roleparser.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/roleschema.json ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/servicemanagementserialize.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/servicebusmanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/servicemanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/sqlmanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/models/databaseresult.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/sqlserveracs.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/sqlservice.js ./node_modules/azure/node_modules/azure/lib/services/table/batchserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/table/internal/sharedkeylitetable.js ./node_modules/azure/node_modules/azure/lib/services/table/internal/sharedkeytable.js ./node_modules/azure/node_modules/azure/lib/services/table/models/entityresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/listresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/queryentitiesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/table/models/querytablesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/table/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/tableresult.js ./node_modules/azure/node_modules/azure/lib/services/table/tablequery.js ./node_modules/azure/node_modules/azure/lib/services/table/tableservice.js ./node_modules/azure/node_modules/azure/lib/util/atomhandler.js ./node_modules/azure/node_modules/azure/lib/util/constants.js ./node_modules/azure/node_modules/azure/lib/util/date.js ./node_modules/azure/node_modules/azure/lib/util/edmtype.js ./node_modules/azure/node_modules/azure/lib/util/iso8061date.js ./node_modules/azure/node_modules/azure/lib/util/js2xml.js ./node_modules/azure/node_modules/azure/lib/util/odatahandler.js ./node_modules/azure/node_modules/azure/lib/util/rfc1123date.js ./node_modules/azure/node_modules/azure/lib/util/util.js ./node_modules/azure/node_modules/azure/lib/util/validate.js ./node_modules/azure/node_modules/azure/LICENSE.txt ./node_modules/azure/node_modules/azure/node_modules/wns/lib/wns.js ./node_modules/azure/node_modules/azure/node_modules/wns/LICENSE.txt 
./node_modules/azure/node_modules/azure/node_modules/wns/package.json ./node_modules/azure/node_modules/azure/node_modules/xml2js/lib/xml2js.js ./node_modules/azure/node_modules/azure/node_modules/xml2js/LICENSE ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/lib/sax.js ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/LICENSE ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/package.json ./node_modules/azure/node_modules/azure/node_modules/xml2js/package.json ./node_modules/azure/node_modules/azure/package.json ./node_modules/azure/node_modules/colors/colors.js ./node_modules/azure/node_modules/colors/MIT-LICENSE.txt ./node_modules/azure/node_modules/colors/package.json ./node_modules/azure/node_modules/commander/index.js ./node_modules/azure/node_modules/commander/lib/commander.js ./node_modules/azure/node_modules/commander/node_modules/keypress/index.js ./node_modules/azure/node_modules/commander/node_modules/keypress/package.json ./node_modules/azure/node_modules/commander/package.json ./node_modules/azure/node_modules/dateformat/lib/dateformat.js ./node_modules/azure/node_modules/dateformat/package.json ./node_modules/azure/node_modules/easy-table/lib/table.js ./node_modules/azure/node_modules/easy-table/package.json ./node_modules/azure/node_modules/eyes/lib/eyes.js ./node_modules/azure/node_modules/eyes/LICENSE ./node_modules/azure/node_modules/eyes/package.json ./node_modules/azure/node_modules/log/index.js ./node_modules/azure/node_modules/log/lib/log.js ./node_modules/azure/node_modules/log/package.json ./node_modules/azure/node_modules/mime/LICENSE ./node_modules/azure/node_modules/mime/mime.js ./node_modules/azure/node_modules/mime/package.json ./node_modules/azure/node_modules/mime/types/mime.types ./node_modules/azure/node_modules/mime/types/node.types ./node_modules/azure/node_modules/node-uuid/LICENSE.md ./node_modules/azure/node_modules/node-uuid/package.json ./node_modules/azure/node_modules/node-uuid/uuid.js ./node_modules/azure/node_modules/qs/component.json ./node_modules/azure/node_modules/qs/index.js ./node_modules/azure/node_modules/qs/lib/head.js ./node_modules/azure/node_modules/qs/lib/querystring.js ./node_modules/azure/node_modules/qs/lib/tail.js ./node_modules/azure/node_modules/qs/package.json ./node_modules/azure/node_modules/qs/querystring.js ./node_modules/azure/node_modules/request/aws.js ./node_modules/azure/node_modules/request/forever.js ./node_modules/azure/node_modules/request/LICENSE ./node_modules/azure/node_modules/request/main.js ./node_modules/azure/node_modules/request/node_modules/form-data/lib/form_data.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/index.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/LICENSE ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/lib/combined_stream.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/License ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/License 
./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/package.json ./node_modules/azure/node_modules/request/node_modules/mime/LICENSE ./node_modules/azure/node_modules/request/node_modules/mime/mime.js ./node_modules/azure/node_modules/request/node_modules/mime/package.json ./node_modules/azure/node_modules/request/node_modules/mime/types/mime.types ./node_modules/azure/node_modules/request/node_modules/mime/types/node.types ./node_modules/azure/node_modules/request/oauth.js ./node_modules/azure/node_modules/request/package.json ./node_modules/azure/node_modules/request/tunnel.js ./node_modules/azure/node_modules/request/uuid.js ./node_modules/azure/node_modules/request/vendor/cookie/index.js ./node_modules/azure/node_modules/request/vendor/cookie/jar.js ./node_modules/azure/node_modules/sax/lib/sax.js ./node_modules/azure/node_modules/sax/LICENSE ./node_modules/azure/node_modules/sax/package.json ./node_modules/azure/node_modules/streamline/AUTHORS ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/decompiler.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/definitions.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsbrowser.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsdecomp.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsdefs.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsexec.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jslex.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsparse.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/lexer.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/parser.js ./node_modules/azure/node_modules/streamline/deps/narcissus/LICENSE ./node_modules/azure/node_modules/streamline/deps/narcissus/main.js ./node_modules/azure/node_modules/streamline/deps/narcissus/package.json ./node_modules/azure/node_modules/streamline/deps/narcissus/xfail/narcissus-failures.txt ./node_modules/azure/node_modules/streamline/deps/narcissus/xfail/narcissus-slow.txt ./node_modules/azure/node_modules/streamline/lib/callbacks/format.js ./node_modules/azure/node_modules/streamline/lib/callbacks/require-stub.js ./node_modules/azure/node_modules/streamline/lib/callbacks/runtime.js ./node_modules/azure/node_modules/streamline/lib/callbacks/transform.js ./node_modules/azure/node_modules/streamline/lib/compile.js ./node_modules/azure/node_modules/streamline/lib/compiler/command.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile--fibers.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile_.js ./node_modules/azure/node_modules/streamline/lib/compiler/index.js ./node_modules/azure/node_modules/streamline/lib/compiler/register.js ./node_modules/azure/node_modules/streamline/lib/fibers/runtime.js ./node_modules/azure/node_modules/streamline/lib/fibers/transform.js ./node_modules/azure/node_modules/streamline/lib/fibers/walker.js ./node_modules/azure/node_modules/streamline/lib/globals.js ./node_modules/azure/node_modules/streamline/lib/index.js ./node_modules/azure/node_modules/streamline/lib/register.js 
./node_modules/azure/node_modules/streamline/lib/require/client/require.js ./node_modules/azure/node_modules/streamline/lib/require/server/depend.js ./node_modules/azure/node_modules/streamline/lib/require/server/require.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams--fibers.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams_.js ./node_modules/azure/node_modules/streamline/lib/streams/jsonRequest.js ./node_modules/azure/node_modules/streamline/lib/streams/readers.js ./node_modules/azure/node_modules/streamline/lib/streams/server/httpHelper.js ./node_modules/azure/node_modules/streamline/lib/streams/server/streams.js ./node_modules/azure/node_modules/streamline/lib/streams/streams.js ./node_modules/azure/node_modules/streamline/lib/tools/docTool.js ./node_modules/azure/node_modules/streamline/lib/transform.js ./node_modules/azure/node_modules/streamline/lib/util/flows--fibers.js ./node_modules/azure/node_modules/streamline/lib/util/flows.js ./node_modules/azure/node_modules/streamline/lib/util/flows_.js ./node_modules/azure/node_modules/streamline/lib/util/future.js ./node_modules/azure/node_modules/streamline/lib/util/index.js ./node_modules/azure/node_modules/streamline/lib/util/url.js ./node_modules/azure/node_modules/streamline/lib/util/uuid.js ./node_modules/azure/node_modules/streamline/module.js ./node_modules/azure/node_modules/streamline/package.json ./node_modules/azure/node_modules/tunnel/index.js ./node_modules/azure/node_modules/tunnel/lib/tunnel.js ./node_modules/azure/node_modules/tunnel/package.json ./node_modules/azure/node_modules/underscore/index.js ./node_modules/azure/node_modules/underscore/LICENSE ./node_modules/azure/node_modules/underscore/package.json ./node_modules/azure/node_modules/underscore/underscore.js ./node_modules/azure/node_modules/underscore.string/lib/underscore.string.js ./node_modules/azure/node_modules/underscore.string/package.json ./node_modules/azure/node_modules/validator/index.js ./node_modules/azure/node_modules/validator/lib/defaultError.js ./node_modules/azure/node_modules/validator/lib/entities.js ./node_modules/azure/node_modules/validator/lib/filter.js ./node_modules/azure/node_modules/validator/lib/index.js ./node_modules/azure/node_modules/validator/lib/validator.js ./node_modules/azure/node_modules/validator/lib/validators.js ./node_modules/azure/node_modules/validator/lib/xss.js ./node_modules/azure/node_modules/validator/LICENSE ./node_modules/azure/node_modules/validator/package.json ./node_modules/azure/node_modules/validator/validator.js ./node_modules/azure/node_modules/winston/lib/winston/common.js ./node_modules/azure/node_modules/winston/lib/winston/config/cli-config.js ./node_modules/azure/node_modules/winston/lib/winston/config/npm-config.js ./node_modules/azure/node_modules/winston/lib/winston/config/syslog-config.js ./node_modules/azure/node_modules/winston/lib/winston/config.js ./node_modules/azure/node_modules/winston/lib/winston/container.js ./node_modules/azure/node_modules/winston/lib/winston/exception.js ./node_modules/azure/node_modules/winston/lib/winston/logger.js ./node_modules/azure/node_modules/winston/lib/winston/transports/console.js ./node_modules/azure/node_modules/winston/lib/winston/transports/file.js ./node_modules/azure/node_modules/winston/lib/winston/transports/http.js ./node_modules/azure/node_modules/winston/lib/winston/transports/transport.js 
./node_modules/azure/node_modules/winston/lib/winston/transports/webhook.js ./node_modules/azure/node_modules/winston/lib/winston/transports.js ./node_modules/azure/node_modules/winston/lib/winston.js ./node_modules/azure/node_modules/winston/LICENSE ./node_modules/azure/node_modules/winston/node_modules/cycle/cycle.js ./node_modules/azure/node_modules/winston/node_modules/cycle/package.json ./node_modules/azure/node_modules/winston/node_modules/pkginfo/lib/pkginfo.js ./node_modules/azure/node_modules/winston/node_modules/pkginfo/package.json ./node_modules/azure/node_modules/winston/node_modules/request/aws.js ./node_modules/azure/node_modules/winston/node_modules/request/aws2.js ./node_modules/azure/node_modules/winston/node_modules/request/forever.js ./node_modules/azure/node_modules/winston/node_modules/request/LICENSE ./node_modules/azure/node_modules/winston/node_modules/request/main.js ./node_modules/azure/node_modules/winston/node_modules/request/mimetypes.js ./node_modules/azure/node_modules/winston/node_modules/request/oauth.js ./node_modules/azure/node_modules/winston/node_modules/request/package.json ./node_modules/azure/node_modules/winston/node_modules/request/tunnel.js ./node_modules/azure/node_modules/winston/node_modules/request/uuid.js ./node_modules/azure/node_modules/winston/node_modules/request/vendor/cookie/index.js ./node_modules/azure/node_modules/winston/node_modules/request/vendor/cookie/jar.js ./node_modules/azure/node_modules/winston/node_modules/stack-trace/lib/stack-trace.js ./node_modules/azure/node_modules/winston/node_modules/stack-trace/License ./node_modules/azure/node_modules/winston/node_modules/stack-trace/package.json ./node_modules/azure/node_modules/winston/package.json ./node_modules/azure/node_modules/xml2js/lib/xml2js.js ./node_modules/azure/node_modules/xml2js/LICENSE ./node_modules/azure/node_modules/xml2js/package.json ./node_modules/azure/node_modules/xmlbuilder/lib/index.js ./node_modules/azure/node_modules/xmlbuilder/lib/XMLBuilder.js ./node_modules/azure/node_modules/xmlbuilder/lib/XMLFragment.js ./node_modules/azure/node_modules/xmlbuilder/package.json ./node_modules/azure/package.json ./node_modules/dpush/lib/dpush.js ./node_modules/dpush/LICENSE.txt ./node_modules/dpush/package.json ./node_modules/express/.npmignore ./node_modules/express/.travis.yml ./node_modules/express/bin/express ./node_modules/express/History.md ./node_modules/express/index.js ./node_modules/express/lib/application.js ./node_modules/express/lib/express.js ./node_modules/express/lib/middleware.js ./node_modules/express/lib/request.js ./node_modules/express/lib/response.js ./node_modules/express/lib/router/index.js ./node_modules/express/lib/router/route.js ./node_modules/express/lib/utils.js ./node_modules/express/lib/view.js ./node_modules/express/LICENSE ./node_modules/express/Makefile ./node_modules/express/node_modules/buffer-crc32/.npmignore ./node_modules/express/node_modules/buffer-crc32/.travis.yml ./node_modules/express/node_modules/buffer-crc32/index.js ./node_modules/express/node_modules/buffer-crc32/package.json ./node_modules/express/node_modules/buffer-crc32/README.md ./node_modules/express/node_modules/buffer-crc32/tests/crc.test.js ./node_modules/express/node_modules/commander/.npmignore ./node_modules/express/node_modules/commander/.travis.yml ./node_modules/express/node_modules/commander/History.md ./node_modules/express/node_modules/commander/index.js ./node_modules/express/node_modules/commander/lib/commander.js 
./node_modules/express/node_modules/commander/Makefile ./node_modules/express/node_modules/commander/package.json ./node_modules/express/node_modules/commander/Readme.md ./node_modules/express/node_modules/connect/.npmignore ./node_modules/express/node_modules/connect/.travis.yml ./node_modules/express/node_modules/connect/index.js ./node_modules/express/node_modules/connect/lib/cache.js ./node_modules/express/node_modules/connect/lib/connect.js ./node_modules/express/node_modules/connect/lib/index.js ./node_modules/express/node_modules/connect/lib/middleware/basicAuth.js ./node_modules/express/node_modules/connect/lib/middleware/bodyParser.js ./node_modules/express/node_modules/connect/lib/middleware/compress.js ./node_modules/express/node_modules/connect/lib/middleware/cookieParser.js ./node_modules/express/node_modules/connect/lib/middleware/cookieSession.js ./node_modules/express/node_modules/connect/lib/middleware/csrf.js ./node_modules/express/node_modules/connect/lib/middleware/directory.js ./node_modules/express/node_modules/connect/lib/middleware/errorHandler.js ./node_modules/express/node_modules/connect/lib/middleware/favicon.js ./node_modules/express/node_modules/connect/lib/middleware/json.js ./node_modules/express/node_modules/connect/lib/middleware/limit.js ./node_modules/express/node_modules/connect/lib/middleware/logger.js ./node_modules/express/node_modules/connect/lib/middleware/methodOverride.js ./node_modules/express/node_modules/connect/lib/middleware/multipart.js ./node_modules/express/node_modules/connect/lib/middleware/query.js ./node_modules/express/node_modules/connect/lib/middleware/responseTime.js ./node_modules/express/node_modules/connect/lib/middleware/session/cookie.js ./node_modules/express/node_modules/connect/lib/middleware/session/memory.js ./node_modules/express/node_modules/connect/lib/middleware/session/session.js ./node_modules/express/node_modules/connect/lib/middleware/session/store.js ./node_modules/express/node_modules/connect/lib/middleware/session.js ./node_modules/express/node_modules/connect/lib/middleware/static.js ./node_modules/express/node_modules/connect/lib/middleware/staticCache.js ./node_modules/express/node_modules/connect/lib/middleware/timeout.js ./node_modules/express/node_modules/connect/lib/middleware/urlencoded.js ./node_modules/express/node_modules/connect/lib/middleware/vhost.js ./node_modules/express/node_modules/connect/lib/patch.js ./node_modules/express/node_modules/connect/lib/proto.js ./node_modules/express/node_modules/connect/lib/public/directory.html ./node_modules/express/node_modules/connect/lib/public/error.html ./node_modules/express/node_modules/connect/lib/public/favicon.ico ./node_modules/express/node_modules/connect/lib/public/icons/page.png ./node_modules/express/node_modules/connect/lib/public/icons/page_add.png ./node_modules/express/node_modules/connect/lib/public/icons/page_attach.png ./node_modules/express/node_modules/connect/lib/public/icons/page_code.png ./node_modules/express/node_modules/connect/lib/public/icons/page_copy.png ./node_modules/express/node_modules/connect/lib/public/icons/page_delete.png ./node_modules/express/node_modules/connect/lib/public/icons/page_edit.png ./node_modules/express/node_modules/connect/lib/public/icons/page_error.png ./node_modules/express/node_modules/connect/lib/public/icons/page_excel.png ./node_modules/express/node_modules/connect/lib/public/icons/page_find.png ./node_modules/express/node_modules/connect/lib/public/icons/page_gear.png 
./node_modules/express/node_modules/connect/lib/public/icons/page_go.png ./node_modules/express/node_modules/connect/lib/public/icons/page_green.png ./node_modules/express/node_modules/connect/lib/public/icons/page_key.png ./node_modules/express/node_modules/connect/lib/public/icons/page_lightning.png ./node_modules/express/node_modules/connect/lib/public/icons/page_link.png ./node_modules/express/node_modules/connect/lib/public/icons/page_paintbrush.png ./node_modules/express/node_modules/connect/lib/public/icons/page_paste.png ./node_modules/express/node_modules/connect/lib/public/icons/page_red.png ./node_modules/express/node_modules/connect/lib/public/icons/page_refresh.png ./node_modules/express/node_modules/connect/lib/public/icons/page_save.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_acrobat.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_actionscript.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_add.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_c.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_camera.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cd.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_code.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_code_red.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_coldfusion.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_compressed.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_copy.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cplusplus.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_csharp.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cup.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_database.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_delete.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_dvd.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_edit.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_error.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_excel.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_find.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_flash.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_freehand.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_gear.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_get.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_go.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_h.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_horizontal.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_key.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_lightning.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_link.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_magnify.png 
./node_modules/express/node_modules/connect/lib/public/icons/page_white_medal.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_office.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paint.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paintbrush.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paste.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_php.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_picture.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_powerpoint.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_put.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_ruby.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_stack.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_star.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_swoosh.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_text.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_text_width.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_tux.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_vector.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_visualstudio.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_width.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_word.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_world.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_wrench.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_zip.png ./node_modules/express/node_modules/connect/lib/public/icons/page_word.png ./node_modules/express/node_modules/connect/lib/public/icons/page_world.png ./node_modules/express/node_modules/connect/lib/public/style.css ./node_modules/express/node_modules/connect/lib/utils.js ./node_modules/express/node_modules/connect/LICENSE ./node_modules/express/node_modules/connect/node_modules/bytes/.npmignore ./node_modules/express/node_modules/connect/node_modules/bytes/component.json ./node_modules/express/node_modules/connect/node_modules/bytes/History.md ./node_modules/express/node_modules/connect/node_modules/bytes/index.js ./node_modules/express/node_modules/connect/node_modules/bytes/Makefile ./node_modules/express/node_modules/connect/node_modules/bytes/package.json ./node_modules/express/node_modules/connect/node_modules/bytes/Readme.md ./node_modules/express/node_modules/connect/node_modules/formidable/.npmignore ./node_modules/express/node_modules/connect/node_modules/formidable/.travis.yml ./node_modules/express/node_modules/connect/node_modules/formidable/benchmark/bench-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/json.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/post.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/upload.js ./node_modules/express/node_modules/connect/node_modules/formidable/index.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/file.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/incoming_form.js 
./node_modules/express/node_modules/connect/node_modules/formidable/lib/index.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/json_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/multipart_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/octet_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/querystring_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/LICENSE ./node_modules/express/node_modules/connect/node_modules/formidable/package.json ./node_modules/express/node_modules/connect/node_modules/formidable/Readme.md ./node_modules/express/node_modules/connect/node_modules/formidable/test/common.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/beta-sticker-1.png ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/binaryfile.tar.gz ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/blank.gif ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/funkyfilename.txt ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/menu_separator.png ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/plain.txt ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/http/special-chars-in-filename/info.md ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/encoding.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/misc.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/no-filename.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/preamble.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/special-chars-in-filename.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/workarounds.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/multipart.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-fixtures.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-json.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-octet-stream.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/common.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/integration/test-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-file.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-incoming-form.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-querystring-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/system/test-multi-video-upload.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/run.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-connection-aborted.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-content-transfer-encoding.js 
./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-issue-46.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/tools/base64.html ./node_modules/express/node_modules/connect/node_modules/formidable/test/unit/test-file.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/unit/test-incoming-form.js ./node_modules/express/node_modules/connect/node_modules/formidable/tool/record.js ./node_modules/express/node_modules/connect/node_modules/pause/.npmignore ./node_modules/express/node_modules/connect/node_modules/pause/History.md ./node_modules/express/node_modules/connect/node_modules/pause/index.js ./node_modules/express/node_modules/connect/node_modules/pause/Makefile ./node_modules/express/node_modules/connect/node_modules/pause/package.json ./node_modules/express/node_modules/connect/node_modules/pause/Readme.md ./node_modules/express/node_modules/connect/node_modules/qs/.gitmodules ./node_modules/express/node_modules/connect/node_modules/qs/.npmignore ./node_modules/express/node_modules/connect/node_modules/qs/index.js ./node_modules/express/node_modules/connect/node_modules/qs/package.json ./node_modules/express/node_modules/connect/node_modules/qs/Readme.md ./node_modules/express/node_modules/connect/package.json ./node_modules/express/node_modules/connect/test.js ./node_modules/express/node_modules/cookie/.npmignore ./node_modules/express/node_modules/cookie/.travis.yml ./node_modules/express/node_modules/cookie/index.js ./node_modules/express/node_modules/cookie/package.json ./node_modules/express/node_modules/cookie/README.md ./node_modules/express/node_modules/cookie/test/mocha.opts ./node_modules/express/node_modules/cookie/test/parse.js ./node_modules/express/node_modules/cookie/test/serialize.js ./node_modules/express/node_modules/cookie-signature/.npmignore ./node_modules/express/node_modules/cookie-signature/History.md ./node_modules/express/node_modules/cookie-signature/index.js ./node_modules/express/node_modules/cookie-signature/Makefile ./node_modules/express/node_modules/cookie-signature/package.json ./node_modules/express/node_modules/cookie-signature/Readme.md ./node_modules/express/node_modules/debug/.npmignore ./node_modules/express/node_modules/debug/component.json ./node_modules/express/node_modules/debug/debug.js ./node_modules/express/node_modules/debug/example/app.js ./node_modules/express/node_modules/debug/example/browser.html ./node_modules/express/node_modules/debug/example/wildcards.js ./node_modules/express/node_modules/debug/example/worker.js ./node_modules/express/node_modules/debug/History.md ./node_modules/express/node_modules/debug/index.js ./node_modules/express/node_modules/debug/lib/debug.js ./node_modules/express/node_modules/debug/package.json ./node_modules/express/node_modules/debug/Readme.md ./node_modules/express/node_modules/fresh/.npmignore ./node_modules/express/node_modules/fresh/index.js ./node_modules/express/node_modules/fresh/Makefile ./node_modules/express/node_modules/fresh/package.json ./node_modules/express/node_modules/fresh/Readme.md ./node_modules/express/node_modules/methods/index.js ./node_modules/express/node_modules/methods/package.json ./node_modules/express/node_modules/mkdirp/.npmignore ./node_modules/express/node_modules/mkdirp/.travis.yml ./node_modules/express/node_modules/mkdirp/examples/pow.js ./node_modules/express/node_modules/mkdirp/index.js ./node_modules/express/node_modules/mkdirp/LICENSE 
./node_modules/express/node_modules/mkdirp/package.json ./node_modules/express/node_modules/mkdirp/README.markdown ./node_modules/express/node_modules/mkdirp/test/chmod.js ./node_modules/express/node_modules/mkdirp/test/clobber.js ./node_modules/express/node_modules/mkdirp/test/mkdirp.js ./node_modules/express/node_modules/mkdirp/test/perm.js ./node_modules/express/node_modules/mkdirp/test/perm_sync.js ./node_modules/express/node_modules/mkdirp/test/race.js ./node_modules/express/node_modules/mkdirp/test/rel.js ./node_modules/express/node_modules/mkdirp/test/return.js ./node_modules/express/node_modules/mkdirp/test/return_sync.js ./node_modules/express/node_modules/mkdirp/test/root.js ./node_modules/express/node_modules/mkdirp/test/sync.js ./node_modules/express/node_modules/mkdirp/test/umask.js ./node_modules/express/node_modules/mkdirp/test/umask_sync.js ./node_modules/express/node_modules/range-parser/.npmignore ./node_modules/express/node_modules/range-parser/History.md ./node_modules/express/node_modules/range-parser/index.js ./node_modules/express/node_modules/range-parser/Makefile ./node_modules/express/node_modules/range-parser/package.json ./node_modules/express/node_modules/range-parser/Readme.md ./node_modules/express/node_modules/send/.npmignore ./node_modules/express/node_modules/send/History.md ./node_modules/express/node_modules/send/index.js ./node_modules/express/node_modules/send/lib/send.js ./node_modules/express/node_modules/send/lib/utils.js ./node_modules/express/node_modules/send/Makefile ./node_modules/express/node_modules/send/node_modules/mime/LICENSE ./node_modules/express/node_modules/send/node_modules/mime/mime.js ./node_modules/express/node_modules/send/node_modules/mime/package.json ./node_modules/express/node_modules/send/node_modules/mime/README.md ./node_modules/express/node_modules/send/node_modules/mime/test.js ./node_modules/express/node_modules/send/node_modules/mime/types/mime.types ./node_modules/express/node_modules/send/node_modules/mime/types/node.types ./node_modules/express/node_modules/send/package.json ./node_modules/express/node_modules/send/Readme.md ./node_modules/express/package.json ./node_modules/express/Readme.md ./node_modules/mpns/lib/mpns.js ./node_modules/mpns/package.json ./node_modules/oauth/index.js ./node_modules/oauth/lib/oauth.js ./node_modules/oauth/lib/oauth2.js ./node_modules/oauth/lib/sha1.js ./node_modules/oauth/lib/_utils.js ./node_modules/oauth/LICENSE ./node_modules/oauth/package.json ./node_modules/pusher/index.js ./node_modules/pusher/lib/pusher.js ./node_modules/pusher/node_modules/request/aws.js ./node_modules/pusher/node_modules/request/aws2.js ./node_modules/pusher/node_modules/request/forever.js ./node_modules/pusher/node_modules/request/LICENSE ./node_modules/pusher/node_modules/request/main.js ./node_modules/pusher/node_modules/request/mimetypes.js ./node_modules/pusher/node_modules/request/oauth.js ./node_modules/pusher/node_modules/request/package.json ./node_modules/pusher/node_modules/request/tunnel.js ./node_modules/pusher/node_modules/request/uuid.js ./node_modules/pusher/node_modules/request/vendor/cookie/index.js ./node_modules/pusher/node_modules/request/vendor/cookie/jar.js ./node_modules/pusher/package.json ./node_modules/request/forever.js ./node_modules/request/LICENSE ./node_modules/request/main.js ./node_modules/request/mimetypes.js ./node_modules/request/oauth.js ./node_modules/request/package.json ./node_modules/request/uuid.js ./node_modules/request/vendor/cookie/index.js 
./node_modules/request/vendor/cookie/jar.js ./node_modules/sax/lib/sax.js ./node_modules/sax/LICENSE ./node_modules/sax/package.json ./node_modules/sendgrid/index.js ./node_modules/sendgrid/lib/email.js ./node_modules/sendgrid/lib/file_handler.js ./node_modules/sendgrid/lib/sendgrid.js ./node_modules/sendgrid/lib/smtpapi_headers.js ./node_modules/sendgrid/lib/validation.js ./node_modules/sendgrid/MIT.LICENSE ./node_modules/sendgrid/node_modules/mime/LICENSE ./node_modules/sendgrid/node_modules/mime/mime.js ./node_modules/sendgrid/node_modules/mime/package.json ./node_modules/sendgrid/node_modules/mime/types/mime.types ./node_modules/sendgrid/node_modules/mime/types/node.types ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/sendmail.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/ses.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/smtp.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/stub.js ./node_modules/sendgrid/node_modules/nodemailer/lib/helpers.js ./node_modules/sendgrid/node_modules/nodemailer/lib/nodemailer.js ./node_modules/sendgrid/node_modules/nodemailer/lib/transport.js ./node_modules/sendgrid/node_modules/nodemailer/lib/wellknown.js ./node_modules/sendgrid/node_modules/nodemailer/lib/xoauth.js ./node_modules/sendgrid/node_modules/nodemailer/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/dkim.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/mailcomposer.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/punycode.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/urlfetch.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/content-types.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/index.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/mime-functions.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/index.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/client.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/pool.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/server.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/starttls.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/cert/cert.pem ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/cert/key.pem ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/mockup.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/rai.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/starttls.js 
./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/package.json ./node_modules/sendgrid/node_modules/nodemailer/package.json ./node_modules/sendgrid/node_modules/step/lib/step.js ./node_modules/sendgrid/node_modules/step/package.json ./node_modules/sendgrid/node_modules/underscore/index.js ./node_modules/sendgrid/node_modules/underscore/LICENSE ./node_modules/sendgrid/node_modules/underscore/package.json ./node_modules/sendgrid/node_modules/underscore/underscore.js ./node_modules/sendgrid/package.json ./node_modules/sqlserver/lib/sql.js ./node_modules/sqlserver/lib/sqlserver.native.js ./node_modules/sqlserver/lib/sqlserver.node ./node_modules/sqlserver/package.json ./node_modules/tripwire/lib/native/windows/x86/tripwire.node ./node_modules/tripwire/lib/tripwire.js ./node_modules/tripwire/LICENSE.txt ./node_modules/tripwire/package.json ./node_modules/underscore/LICENSE ./node_modules/underscore/package.json ./node_modules/underscore/underscore.js ./node_modules/underscore.string/lib/underscore.string.js ./node_modules/underscore.string/package.json ./node_modules/wns/lib/wns.js ./node_modules/wns/LICENSE.txt ./node_modules/wns/package.json ./node_modules/xml2js/lib/xml2js.js ./node_modules/xml2js/LICENSE ./node_modules/xml2js/node_modules/sax/lib/sax.js ./node_modules/xml2js/node_modules/sax/LICENSE ./node_modules/xml2js/node_modules/sax/package.json ./node_modules/xml2js/package.json ./node_modules/xmlbuilder/lib/index.js ./node_modules/xmlbuilder/lib/XMLBuilder.js ./node_modules/xmlbuilder/lib/XMLFragment.js ./node_modules/xmlbuilder/package.json ./runtime/core.js ./runtime/filehelpers.js ./runtime/jsonwebtoken.js ./runtime/logger.js ./runtime/logwriter.js ./runtime/metrics.js ./runtime/query/expressions.js ./runtime/query/expressionvisitor.js ./runtime/query/queryparser.js ./runtime/request/authentication/facebook.js ./runtime/request/authentication/google.js ./runtime/request/authentication/microsoftaccount.js ./runtime/request/authentication/twitter.js ./runtime/request/dataoperation.js ./runtime/request/datapipeline.js ./runtime/request/html/corshelper.js ./runtime/request/html/crossdomainhandler.js ./runtime/request/html/templates/crossdomainbridge.html ./runtime/request/html/templates/loginviaiframe.html ./runtime/request/html/templates/loginviaiframereceiver.html ./runtime/request/html/templates/loginviapostmessage.html ./runtime/request/html/templating.js ./runtime/request/loginhandler.js ./runtime/request/middleware/allowHandler.js ./runtime/request/middleware/authenticate.js ./runtime/request/middleware/authorize.js ./runtime/request/middleware/bodyParser.js ./runtime/request/middleware/errorHandler.js ./runtime/request/middleware/requestLimit.js ./runtime/request/request.js ./runtime/request/requesthandler.js ./runtime/request/schedulerhandler.js ./runtime/request/statushandler.js ./runtime/request/tablehandler.js ./runtime/resources.js ./runtime/script/apibuilder.js ./runtime/script/metadata.js ./runtime/script/push/notify-apns.js ./runtime/script/push/notify-gcm.js ./runtime/script/push/notify-mpns.js ./runtime/script/push/notify-wns.js ./runtime/script/push/notify.js ./runtime/script/scriptcache.js ./runtime/script/scripterror.js ./runtime/script/scriptloader.js ./runtime/script/scriptmanager.js ./runtime/script/scriptstate.js ./runtime/script/sqladapter.js 
./runtime/script/table.js ./runtime/server.js ./runtime/statuscodes.js ./runtime/storage/sqlbooleanizer.js ./runtime/storage/sqlformatter.js ./runtime/storage/sqlhelpers.js ./runtime/storage/storage.js ./runtime/Zumo.Node.js ./static/client/MobileServices.Web-1.0.0.js ./static/client/MobileServices.Web-1.0.0.min.js ./static/default.htm ./static/robots.txt ./Web.config

    Read the article

  • AngularJS on top of ASP.NET: Moving the MVC framework out to the browser

    - by Varun Chatterji
Heavily drawing inspiration from Ruby on Rails, MVC4’s convention over configuration model of development soon became the Holy Grail of .NET web development. The MVC model brought with it the goodness of proper separation of concerns between business logic, data, and the presentation logic. However, the MVC paradigm was still one in which server side .NET code could be mixed with presentation code. The Razor templating engine, though cleaner than its predecessors, still encouraged and allowed you to mix .NET server side code with presentation logic. Thus, for example, if the developer required a certain <div> tag to be shown if a particular variable ShowDiv was true in the View’s model, the code could look like the following: Fig 1: To show a div or not. Server side .NET code is used in the View. Mixing .NET code with HTML in views can soon get very messy. Wouldn’t it be nice if the presentation layer (HTML) could be pure HTML? Also, in the ASP.NET MVC model, some of the business logic invariably resides in the controller. It is tempting to use an anti-pattern like the one shown above to control whether a div should be shown or not. However, best practice would indicate that the Controller should not be aware of the div. The ShowDiv variable in the model should not exist. A controller should ideally only be used to do the plumbing of getting the data populated in the model and nothing else. The view (ideally pure HTML) should render the presentation layer based on the model. In this article we will see how AngularJS, a new JavaScript framework by Google, can be used effectively to build web applications where: 1. Views are pure HTML 2. Controllers (in the server sense) are pure REST based API calls 3. The presentation layer is loaded as needed from partial HTML only files. What is MVVM? MVVM, short for Model-View-ViewModel, is a new paradigm in web development. In this paradigm, the Model and View stuff exists on the client side through JavaScript instead of being processed on the server through postbacks. These frameworks are JavaScript frameworks that facilitate the clear separation of the “frontend” or the data rendering logic from the “backend” which is typically just a REST based API that loads and processes data through a resource model. The frameworks are called MVVM as a change to the Model (through JavaScript) gets reflected in the view immediately i.e. Model > View. Also, a change on the view (through manual input) gets reflected in the model immediately i.e. View > Model. The following figure shows this conceptually (comments are shown in red): Fig 2: Demonstration of MVVM in action. In Fig 2, two text boxes are bound to the same variable model.myInt. Thus, changing the view manually (changing one text box through keyboard input) also changes the other textbox in real time, demonstrating the V > M property of an MVVM framework. Furthermore, clicking the button adds 1 to the value of model.myInt, thus changing the model through JavaScript. This immediately updates the view (the value in the two textboxes), thus demonstrating the M > V property of an MVVM framework. Thus we see that the model in an MVVM JavaScript framework can be regarded as “the single source of truth“. This is an important concept. Angular is one such MVVM framework. We shall use it to build a simple app that sends SMS messages to a particular number. Application, Routes, Views, Controllers, Scope and Models Angular can be used in many ways to construct web applications. 
For this article, we shall only focus on building Single Page Applications (SPAs). Many of the approaches we will follow in this article have alternatives. It is beyond the scope of this article to explain every nuance in detail, but we shall try to touch upon the basic concepts and end up with a working application that can be used to send SMS messages using Sent.ly Plus (a service that is itself built using Angular). Before you read on, we would like to urge you to forget what you know about Models, Views, Controllers and Routes in the ASP.NET MVC4 framework. All these words have different meanings in the Angular world. Whenever these words are used in this article, they will refer to Angular concepts and not ASP.NET MVC4 concepts. The following figure shows the skeleton of the root page of an SPA: Fig 3: The skeleton of a SPA. The skeleton of the application is based on the Bootstrap starter template which can be found at: http://getbootstrap.com/examples/starter-template/ Apart from loading the Angular, jQuery and Bootstrap JavaScript libraries, it also loads our custom scripts /app/js/controllers.js /app/js/app.js These scripts define the routes, views and controllers which we shall come to in a moment. Application Notice that the body tag (Fig 3) has an extra attribute: ng-app="smsApp" Providing this attribute “bootstraps” our single page application. It tells Angular to load a “module” called smsApp. This “module” is defined in /app/js/app.js:

angular.module('smsApp', ['smsApp.controllers', function () {}])

Fig 4: The definition of our application module. The line shown above declares a module called smsApp. It also declares that this module “depends” on another module called “smsApp.controllers”. The smsApp.controllers module will contain all the controllers for our SPA. Routing and Views Notice that in the Navbar (in Fig 3) we have included two hyperlinks to: “#/app” “#/help” This is how Angular handles routing. Since the URLs start with “#”, they are actually just bookmarks (and not server side resources). However, our route definition (in /app/js/app.js) gives these URLs a special meaning within the Angular framework.

angular.module('smsApp', ['smsApp.controllers', function () { }])
//Configure the routes
.config(['$routeProvider', function ($routeProvider) {
    $routeProvider.when('/binding', {
        templateUrl: '/app/partials/bindingexample.html',
        controller: 'BindingController'
    });
}]);

Fig 5: The definition of a route with an associated partial view and controller. As we can see from the previous code sample, we are using the $routeProvider object in the configuration of our smsApp module. Notice how the code “asks for” the $routeProvider object by specifying it as a dependency in the [] brackets and then defining a function that accepts it as a parameter. This is known as dependency injection. Please refer to the following link if you want to delve into this topic: http://docs.angularjs.org/guide/di The above code snippet tells Angular that when the URL is “#/binding”, it should load the HTML snippet (“partial view”) found at /app/partials/bindingexample.html. Also, for this URL, Angular should load the controller called “BindingController”. We have also marked the div with the class “container” (in Fig 3) with the ng-view attribute. This attribute tells Angular that views (partial HTML pages) defined in the routes will be loaded within this div. 
You can see that the Angular JavaScript framework, unlike many other frameworks, works purely by extending HTML tags and attributes. It also allows you to extend HTML with your own tags and attributes (through directives) if you so desire; you can find out more about directives at the following URL: http://www.codeproject.com/Articles/607873/Extending-HTML-with-AngularJS-Directives Controllers and Models We have seen how we define what views and controllers should be loaded for a particular route. Let us now consider how controllers are defined. Our controllers are defined in the file /app/js/controllers.js. The following snippet shows the definition of the “BindingController” which is loaded when we hit the URL http://localhost:port/index.html#/binding (as we have defined in the route earlier as shown in Fig 5). Remember that we had defined that our application module “smsApp” depends on the “smsApp.controllers” module (see Fig 4). The code snippet below shows how the “BindingController” defined in the route shown in Fig 5 is defined in the module smsApp.controllers:

angular.module('smsApp.controllers', [function () { }])
.controller('BindingController', ['$scope', function ($scope) {
    $scope.model = {};
    $scope.model.myInt = 6;
    $scope.addOne = function () {
        $scope.model.myInt++;
    }
}]);

Fig 6: The definition of a controller in the “smsApp.controllers” module. The pieces are falling into place! Remember Fig 2? That was the code of a partial view that was loaded within the container div of the skeleton SPA shown in Fig 3. The route definition shown in Fig 5 also defined that the controller called “BindingController” (shown in Fig 6) was loaded when we loaded the URL: http://localhost:22544/index.html#/binding The button in Fig 2 was marked with the attribute ng-click="addOne()" which added 1 to the value of model.myInt. In Fig 6, we can see that this function is actually defined in the “BindingController”. Scope We can see from Fig 6 that in the definition of “BindingController”, we defined a dependency on $scope and then, as usual, defined a function which “asks for” $scope as per the dependency injection pattern. So what is $scope? Any guesses? As you might have guessed, a scope is a particular “address space” where variables and functions may be defined. This has a similar meaning to scope in a programming language like C#. Model: The Scope is not the Model It is tempting to assign variables in the scope directly. For example, we could have defined myInt as $scope.myInt = 6 in Fig 6 instead of $scope.model.myInt = 6. The reason why this is a bad idea is that scope is hierarchical in Angular. Thus if we were to define a controller within another controller (nested controllers), then the inner controller would inherit the scope of the parent controller. This inheritance would follow JavaScript prototypal inheritance. Let’s say the parent controller defined a variable through $scope.myInt = 6. The child controller would inherit the scope through JavaScript prototypal inheritance. This basically means that the child scope has a variable myInt that points to the parent scope’s myInt variable. Now if we assigned the value of myInt in the parent, the child scope would be updated with the same value, as the child scope’s myInt variable points to the parent scope’s myInt variable. 
However, if we were to assign the value of the myInt variable in the child scope, then the link of that variable to the parent scope would be broken, as the variable myInt in the child scope now points to the value 6 and not to the parent scope’s myInt variable. But, if we defined a variable model in the parent scope, then the child scope will also have a variable model that points to the model variable in the parent scope. Updating the value of $scope.model.myInt in the parent scope would change the model variable in the child scope too, as that variable points to the model variable in the parent scope. Now changing the value of $scope.model.myInt in the child scope would ALSO change the value in the parent scope. This is because the model reference in the child scope points to the model variable in the parent scope. We did no new assignment to the model variable in the child scope. We only changed an attribute of the model variable. Since the model variable (in the child scope) points to the model variable in the parent scope, we have successfully changed the value of myInt in the parent scope. Thus the value of $scope.model.myInt in the parent scope becomes the “single source of truth“. This is a tricky concept, thus it is considered good practice to NOT use scope inheritance. More info on prototypal inheritance in Angular can be found in the “JavaScript Prototypal Inheritance” section at the following URL: https://github.com/angular/angular.js/wiki/Understanding-Scopes. Building It: An Angular JS application using a .NET Web API Backend Now that we have a perspective on the basic components of an MVVM application built using Angular, let’s build something useful. We will build an application that can be used to send out SMS messages to a given phone number. The following diagram describes the architecture of the application we are going to build: Fig 7: Broad application architecture. We are going to add an HTML Partial to our project. This partial will contain the form fields that will accept the phone number and message that needs to be sent as an SMS. It will also display all the messages that have previously been sent. All the executable code that is run on the occurrence of events (button clicks etc.) in the view resides in the controller. The controller interacts with the ASP.NET WebAPI to get a history of SMS messages, add a message etc. through a REST based API. For simplicity, we will use an in-memory data structure for this application. Thus, the tasks ahead of us are: 1. Creating the REST WebAPI with GET, PUT, POST and DELETE methods. 2. Creating the SmsView.html partial. 3. Creating the SmsController controller with methods that are called from the SmsView.html partial. 4. Adding a new route that loads the controller and the partial. 1. Creating the REST WebAPI This is a simple task that should be quite straightforward for any .NET developer. 
The following listing shows our ApiController:

public class SmsMessage
{
    public string to { get; set; }
    public string message { get; set; }
}

public class SmsResource : SmsMessage
{
    public int smsId { get; set; }
}

public class SmsResourceController : ApiController
{
    public static Dictionary<int, SmsResource> messages = new Dictionary<int, SmsResource>();
    public static int currentId = 0;

    // GET api/<controller>
    public List<SmsResource> Get()
    {
        List<SmsResource> result = new List<SmsResource>();
        foreach (int key in messages.Keys)
        {
            result.Add(messages[key]);
        }
        return result;
    }

    // GET api/<controller>/5
    public SmsResource Get(int id)
    {
        if (messages.ContainsKey(id))
            return messages[id];
        return null;
    }

    // POST api/<controller>
    public List<SmsResource> Post([FromBody] SmsMessage value)
    {
        //Synchronize on messages so we don't have id collisions
        lock (messages)
        {
            //Copy into a new SmsResource; a direct downcast of the
            //deserialized SmsMessage would fail at runtime
            SmsResource res = new SmsResource { to = value.to, message = value.message };
            res.smsId = currentId++;
            messages.Add(res.smsId, res);
            //SentlyPlusSmsSender.SendMessage(value.to, value.message);
            return Get();
        }
    }

    // PUT api/<controller>/5
    public List<SmsResource> Put(int id, [FromBody] SmsMessage value)
    {
        //Synchronize on messages so we don't have id collisions
        lock (messages)
        {
            if (messages.ContainsKey(id))
            {
                //Update the message
                messages[id].message = value.message;
                messages[id].to = value.to;
            }
            return Get();
        }
    }

    // DELETE api/<controller>/5
    public List<SmsResource> Delete(int id)
    {
        if (messages.ContainsKey(id))
        {
            messages.Remove(id);
        }
        return Get();
    }
}

Once this class is defined, we should be able to access the WebAPI by a simple GET request using the browser: http://localhost:port/api/SmsResource Notice the commented line: //SentlyPlusSmsSender.SendMessage The SentlyPlusSmsSender class is defined in the attached solution. We have left this line commented out as we want to focus on the core Angular concepts. If you load the attached solution, this line is uncommented in the source and an actual SMS will be sent! By default, the API returns XML. For consumption of the API in Angular, we would like it to return JSON. To change the default to JSON, we make the following change to the WebApiConfig.cs file located in the App_Start folder:

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional }
        );

        var appXmlType = config.Formatters.XmlFormatter.SupportedMediaTypes.FirstOrDefault(
            t => t.MediaType == "application/xml");
        config.Formatters.XmlFormatter.SupportedMediaTypes.Remove(appXmlType);
    }
}

We now have our backend REST API which we can consume from Angular! 2. Creating the SmsView.html partial This simple partial will define two fields: the destination phone number (international format starting with a +) and the message. These fields will be bound to model.phoneNumber and model.message. We will also add a button that we shall hook up to sendMessage() in the controller. A list of all previously sent messages (bound to model.allMessages) will also be displayed below the form input. 
The following code shows the code for the partial:

<!-- If model.errorMessage is defined, then render the error div -->
<div class="alert alert-danger alert-dismissable" style="margin-top: 30px;" ng-show="model.errorMessage != undefined">
    <button type="button" class="close" data-dismiss="alert" aria-hidden="true">&times;</button>
    <strong>Error!</strong> <br /> {{ model.errorMessage }}
</div>
<!-- The input fields bound to the model -->
<div class="well" style="margin-top: 30px;">
    <table style="width: 100%;">
        <tr>
            <td style="width: 45%; text-align: center;">
                <input type="text" placeholder="Phone number (eg: +44 7778 609466)" ng-model="model.phoneNumber" class="form-control" style="width: 90%" onkeypress="return checkPhoneInput();" />
            </td>
            <td style="width: 45%; text-align: center;">
                <input type="text" placeholder="Message" ng-model="model.message" class="form-control" style="width: 90%" />
            </td>
            <td style="text-align: center;">
                <button class="btn btn-danger" ng-click="sendMessage();" ng-disabled="model.isAjaxInProgress" style="margin-right: 5px;">Send</button>
                <img src="/Content/ajax-loader.gif" ng-show="model.isAjaxInProgress" />
            </td>
        </tr>
    </table>
</div>
<!-- The past messages -->
<div style="margin-top: 30px;">
    <!-- The following div is shown if there are no past messages -->
    <div ng-show="model.allMessages.length == 0">
        No messages have been sent yet!
    </div>
    <!-- The following div is shown if there are some past messages -->
    <div ng-show="model.allMessages.length != 0">
        <table style="width: 100%;" class="table table-striped">
            <tr>
                <td>Phone Number</td>
                <td>Message</td>
                <td></td>
            </tr>
            <!-- The ng-repeat directive is like the repeater control in .NET, but as you can see this partial is pure HTML which is much cleaner -->
            <tr ng-repeat="message in model.allMessages">
                <td>{{ message.to }}</td>
                <td>{{ message.message }}</td>
                <td>
                    <button class="btn btn-danger" ng-click="delete(message.smsId);" ng-disabled="model.isAjaxInProgress">Delete</button>
                </td>
            </tr>
        </table>
    </div>
</div>

The above code is commented and should be self explanatory. Conditional rendering is achieved through the ng-show="condition" attribute on various div tags. Input fields are bound to the model and the send button is bound to the sendMessage() function in the controller through the ng-click="sendMessage()" attribute defined on the button tag. While AJAX calls are taking place, the controller sets model.isAjaxInProgress to true. Based on this variable, buttons are disabled through the ng-disabled directive which is added as an attribute to the buttons. The ng-repeat directive added as an attribute to the tr tag causes the table row to be rendered multiple times much like an ASP.NET repeater. 3. Creating the SmsController controller The penultimate piece of our application is the controller which responds to events from our view and interacts with our MVC4 REST WebAPI. The following listing shows the code we need to add to /app/js/controllers.js. Note that controller definitions can be chained. Also note that this controller “asks for” the $http service. The $http service is a simple way in Angular to do AJAX. So far we have only encountered modules, controllers, views and directives in Angular. The $http is a new entity in Angular called a service. More information on Angular services can be found at the following URL: http://docs.angularjs.org/guide/dev_guide.services.understanding_services. 
.controller('SmsController', ['$scope', '$http', function ($scope, $http) {
    //We define the model
    $scope.model = {};
    //We define the allMessages array in the model
    //that will contain all the messages sent so far
    $scope.model.allMessages = [];
    //The error if any
    $scope.model.errorMessage = undefined;
    //We initially load data so set the isAjaxInProgress = true;
    $scope.model.isAjaxInProgress = true;
    //Load all the messages
    $http({ url: '/api/smsresource', method: "GET" }).
        success(function (data, status, headers, config) {
            //this callback will be called asynchronously
            //when the response is available
            $scope.model.allMessages = data;
            //We are done with AJAX loading
            $scope.model.isAjaxInProgress = false;
        }).
        error(function (data, status, headers, config) {
            //called asynchronously if an error occurs
            //or server returns response with an error status.
            $scope.model.errorMessage = "Error occurred status:" + status;
            //We are done with AJAX loading
            $scope.model.isAjaxInProgress = false;
        });

    $scope.delete = function (id) {
        //We are making an ajax call so we set this to true
        $scope.model.isAjaxInProgress = true;
        $http({ url: '/api/smsresource/' + id, method: "DELETE" }).
            success(function (data, status, headers, config) {
                // this callback will be called asynchronously
                // when the response is available
                $scope.model.allMessages = data;
                //We are done with AJAX loading
                $scope.model.isAjaxInProgress = false;
            }).
            error(function (data, status, headers, config) {
                // called asynchronously if an error occurs
                // or server returns response with an error status.
                $scope.model.errorMessage = "Error occurred status:" + status;
                //We are done with AJAX loading
                $scope.model.isAjaxInProgress = false;
            });
    }

    $scope.sendMessage = function () {
        $scope.model.errorMessage = undefined;
        var message = '';
        if ($scope.model.message != undefined)
            message = $scope.model.message.trim();
        if ($scope.model.phoneNumber == undefined || $scope.model.phoneNumber == ''
            || $scope.model.phoneNumber.length < 10 || $scope.model.phoneNumber[0] != '+') {
            $scope.model.errorMessage = "You must enter a valid phone number in international format. Eg: +44 7778 609466";
            return;
        }
        if (message.length == 0) {
            $scope.model.errorMessage = "You must specify a message!";
            return;
        }
        //We are making an ajax call so we set this to true
        $scope.model.isAjaxInProgress = true;
        $http({ url: '/api/smsresource', method: "POST", data: { to: $scope.model.phoneNumber, message: $scope.model.message } }).
            success(function (data, status, headers, config) {
                // this callback will be called asynchronously
                // when the response is available
                $scope.model.allMessages = data;
                //We are done with AJAX loading
                $scope.model.isAjaxInProgress = false;
            }).
            error(function (data, status, headers, config) {
                // called asynchronously if an error occurs
                // or server returns response with an error status.
                $scope.model.errorMessage = "Error occurred status:" + status;
                // We are done with AJAX loading
                $scope.model.isAjaxInProgress = false;
            });
    }
}]);

We can see from the previous listing how the functions that are called from the view are defined in the controller. It should also be evident how easy it is to make AJAX calls to consume our MVC4 REST WebAPI. Now we are left with the final piece. We need to define a route that associates a particular path with the view we have defined and the controller we have defined. 4. Add a new route that loads the controller and the partial This is the easiest part of the puzzle. 
We simply define another route in the /app/js/app.js file:

$routeProvider.when('/sms', {
    templateUrl: '/app/partials/smsview.html',
    controller: 'SmsController'
});

Conclusion In this article we have seen how much of the server side functionality in the MVC4 framework can be moved to the browser, thus delivering a snappy and fast user interface. We have seen how we can build client side HTML only views that avoid the messy syntax offered by server side Razor views. We have built a functioning app from the ground up. The significant advantage of this approach to building web apps is that the front end can be completely platform independent. Even though we used ASP.NET to create our REST API, we could just as easily have used any other language such as Node.js, Ruby etc. without changing a single line of our front end code. Angular is a rich framework and we have only touched on the basic functionality required to create a SPA. For readers who wish to delve further into the Angular framework, we would recommend the following URL as a starting point: http://docs.angularjs.org/misc/started. To get started with the code for this project: Sign up for an account at http://plus.sent.ly (free) Add your phone number Go to the “My Identies Page” Note down your Sender ID, Consumer Key and Consumer Secret Download the code for this article at: https://docs.google.com/file/d/0BzjEWqSE31yoZjZlV0d0R2Y3eW8/edit?usp=sharing Change the values of Sender Id, Consumer Key and Consumer Secret in the web.config file. Run the project through Visual Studio!
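As a quick sanity check of the REST API described above, a tiny console client can fetch the JSON before (or after) the Angular front end is wired up. This snippet is not part of the downloadable solution; it is only a sketch that assumes .NET 4.5's HttpClient and the port Visual Studio assigned to the project:

using System;
using System.Net.Http;
using System.Net.Http.Headers;

class ApiSmokeTest
{
    static void Main()
    {
        using (HttpClient client = new HttpClient())
        {
            //ask for JSON explicitly, mirroring what the Angular $http service requests
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            //replace 22544 with the port assigned to your copy of the project
            string json = client.GetStringAsync("http://localhost:22544/api/smsresource").Result;
            Console.WriteLine(json); //prints [] until a message has been posted
        }
    }
}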

    Read the article

  • Windows Phone 7: Building a simple dictionary web client

    - by TechTwaddle
Like I mentioned in this post a while back, I came across a dictionary web service called Aonaware that serves up word definitions from various dictionaries and is really easy to use. The services page on their website, http://services.aonaware.com/DictService/DictService.asmx, lists all the operations that are supported by the dictionary service. Here they are, Word Dictionary Web Service The following operations are supported. For a formal definition, please review the Service Description.

Define – Define given word, returning definitions from all dictionaries
DefineInDict – Define given word, returning definitions from specified dictionary
DictionaryInfo – Show information about the specified dictionary
DictionaryList – Returns a list of available dictionaries
DictionaryListExtended – Returns a list of advanced dictionaries (e.g. translating dictionaries)
Match – Look for matching words in all dictionaries using the given strategy
MatchInDict – Look for matching words in the specified dictionary using the given strategy
ServerInfo – Show remote server information
StrategyList – Return list of all available strategies on the server

Follow the links above to get more information on each API. In this post we will be building a simple Windows Phone 7 client which uses this service to get word definitions for words entered by the user. The application will also allow the user to select a dictionary from all the available ones and look up the word definition in that dictionary. So of all the APIs above we will be using only two: DictionaryList() to get a list of all supported dictionaries and DefineInDict() to get the word definition from a particular dictionary. Before we get started, a note to you all; I would have liked to implement this application using concepts from data binding, item templates, data templates etc. I have a basic understanding of what they are but, being a beginner, I am not very comfortable with those topics yet so I didn’t use them. I thought I’ll get this version out of the way and maybe in the next version I could give those a try. A somewhat scary mock-up of what the final application will look like: Select Dictionary is a list picker control from the Silverlight Toolkit (you need to download and install the toolkit if you haven’t already). Below it is a textbox where the user can enter words to look up and a button beside it to fetch the word definition when clicked. Finally we have a textblock which occupies the remaining area and displays the word definition from the selected dictionary. Create a Silverlight application for Windows Phone 7, AonawareDictionaryClient, and add references to the Silverlight Toolkit and the web service. From the Solution Explorer, right click on References and select Microsoft.Phone.Controls.Toolkit from under the .NET tab. Next, add a reference to the web service. Again right click on References and this time select Add Service Reference. In the resulting dialog paste the service URL in the Address field and press Go (url –> http://services.aonaware.com/DictService/DictService.asmx). Once the service is discovered, provide a name for the namespace; in this case I’ve called it AonawareDictionaryService. Press OK. You can now use the classes and functions that are generated in the AonawareDictionaryClient.AonawareDictionaryService namespace. Let’s get the UI done now. 
In MainPage.xaml add a namespace declaration to use the toolkit controls, xmlns:toolkit="clr-namespace:Microsoft.Phone.Controls;assembly=Microsoft.Phone.Controls.Toolkit" the content of LayoutRoot is changed as follows, (sorry, no syntax highlighting in this post) <StackPanel x:Name="TitlePanel" Grid.Row="0" Margin="12,5,0,5">     <TextBlock x:Name="ApplicationTitle" Text="AONAWARE DICTIONARY CLIENT" Style="{StaticResource PhoneTextNormalStyle}"/>     <!--<TextBlock x:Name="PageTitle" Text="page name" Margin="9,-7,0,0" Style="{StaticResource PhoneTextTitle1Style}"/>--> </StackPanel> <!--ContentPanel - place additional content here--> <Grid x:Name="ContentPanel" Grid.Row="1" Margin="12,0,12,0">     <Grid.RowDefinitions>         <RowDefinition Height="Auto"/>         <RowDefinition Height="Auto"/>         <RowDefinition Height="*"/>     </Grid.RowDefinitions>     <toolkit:ListPicker Grid.Column="1" x:Name="listPickerDictionaryList"                         Header="Select Dictionary :">     </toolkit:ListPicker>     <Grid Grid.Row="1" Margin="0,5,0,0">         <Grid.ColumnDefinitions>             <ColumnDefinition Width="*"/>             <ColumnDefinition Width="Auto" />         </Grid.ColumnDefinitions>         <TextBox x:Name="txtboxInputWord" Grid.Column="0" GotFocus="OnTextboxInputWordGotFocus" />         <Button x:Name="btnGo" Grid.Column="1" Click="OnButtonGoClick" >             <Button.Content>                 <Image Source="/images/button-go.png"/>             </Button.Content>         </Button>     </Grid>     <ScrollViewer Grid.Row="2" x:Name="scrollViewer">         <TextBlock  Margin="12,5,12,5"  x:Name="txtBlockWordMeaning" HorizontalAlignment="Stretch"                    VerticalAlignment="Stretch" TextWrapping="Wrap"                    FontSize="26" />     </ScrollViewer> </Grid> I have commented out the PageTitle as it occupies too much valuable space, and the ContentPanel is changed to contain three rows. The first row contains the list picker control, the second row contains the textbox and the button, and the third row contains a textblock within a scroll viewer. The designer will now be showing the final UI. Now go to MainPage.xaml.cs, and add the following namespace declarations, using Microsoft.Phone.Controls; using AonawareDictionaryClient.AonawareDictionaryService; using System.IO.IsolatedStorage; A class called DictServiceSoapClient would have been created for you in the background when you added a reference to the web service. This class functions as a wrapper to the services exported by the web service. All the web service functions that we saw at the start can be accessed through this class, or more precisely through an object of this class. Create a data member of type DictServiceSoapClient in the MainPage class, and a function which initializes it, DictServiceSoapClient DictSvcClient = null; private DictServiceSoapClient GetDictServiceSoapClient() {     if (null == DictSvcClient)     {         DictSvcClient = new DictServiceSoapClient();     }     return DictSvcClient; } We have two major tasks remaining. First, when the application loads we need to populate the list picker with all the supported dictionaries and second, when the user enters a word and clicks on the arrow button we need to fetch the word’s meaning. Populating the List Picker In the OnNavigatedTo event of the MainPage, we call the DictionaryList() API. This can also be done in the OnLoading event handler of the MainPage; not sure if one has an advantage over the other. 
Here’s the code for OnNavigatedTo, protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e) {     DictServiceSoapClient client = GetDictServiceSoapClient();     client.DictionaryListCompleted += new EventHandler<DictionaryListCompletedEventArgs>(OnGetDictionaryListCompleted);     client.DictionaryListAsync();     base.OnNavigatedTo(e); } Windows Phone 7 supports only async calls to web services. When we added a reference to the dictionary service, asynchronous versions of all the functions were generated automatically. So in the above function we register a handler to the DictionaryListCompleted event which will occur when the call to DictionaryList() gets a response from the server. Then we call the DictionaryListAsync() function which is the async version of the DictionaryList() API. The result of this API call will be sent to the handler OnGetDictionaryListCompleted(), void OnGetDictionaryListCompleted(object sender, DictionaryListCompletedEventArgs e) {     IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;     Dictionary[] listOfDictionaries;     if (e.Error == null)     {         listOfDictionaries = e.Result;         PopulateListPicker(listOfDictionaries, settings);     }     else if (settings.Contains("SavedDictionaryList"))     {         listOfDictionaries = settings["SavedDictionaryList"] as Dictionary[];         PopulateListPicker(listOfDictionaries, settings);     }     else     {         MessageBoxResult res = MessageBox.Show("An error occurred while retrieving dictionary list, do you want to try again?", "Error", MessageBoxButton.OKCancel);         if (MessageBoxResult.OK == res)         {             GetDictServiceSoapClient().DictionaryListAsync();         }     }     settings.Save(); } I have used IsolatedStorageSettings to store a few things: the entire dictionary list and the dictionary that is selected when the user exits the application, so that the next time the user starts the application the current dictionary is set to the last selected value. First we check if the API returned any error; if the error object is null, e.Result will contain the list (actually an array) of Dictionary type objects. If there was an error, we check the isolated storage settings to see if there is a dictionary list stored from a previous instance of the application and if so, we populate the list picker based on this saved list. Note that in this case there are chances that the dictionary list might be out of date if there have been changes on the server. Finally, if none of these cases are true, we display an error message to the user and try to fetch the list again. 
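As an aside, the Contains-then-cast dance against IsolatedStorageSettings appears in a few places in this app, so it could be pulled into a small helper. The following is only a sketch; the SettingsHelper class and the GetOrDefault name are mine, not part of the original project:

using System.IO.IsolatedStorage;

public static class SettingsHelper
{
    //returns the stored value for 'key', or 'defaultValue' when the key is absent
    public static T GetOrDefault<T>(string key, T defaultValue)
    {
        IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;
        if (settings.Contains(key))
        {
            return (T)settings[key];
        }
        return defaultValue;
    }
}

//usage:
//Dictionary[] list = SettingsHelper.GetOrDefault<Dictionary[]>("SavedDictionaryList", null);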
PopulateListPicker() is passed the array of Dictionary objects and the settings object as well, void PopulateListPicker(Dictionary[] listOfDictionaries, IsolatedStorageSettings settings) {     listPickerDictionaryList.Items.Clear();     foreach (Dictionary dictionary in listOfDictionaries)     {         listPickerDictionaryList.Items.Add(dictionary.Name);     }     settings["SavedDictionaryList"] = listOfDictionaries;     string savedDictionaryName;     if (settings.Contains("SavedDictionary"))     {         savedDictionaryName = settings["SavedDictionary"] as string;     }     else     {         savedDictionaryName = "WordNet (r) 2.0"; //default dictionary, wordnet     }     foreach (string dictName in listPickerDictionaryList.Items)     {         if (dictName == savedDictionaryName)         {             listPickerDictionaryList.SelectedItem = dictName;             break;         }     }     settings["SavedDictionary"] = listPickerDictionaryList.SelectedItem as string; } We first clear all the items from the list picker, add the dictionary names from the array and then create a key in the settings called SavedDictionaryList and store the dictionary list in it. We then check if there is a saved dictionary available from a previous instance; if there is, we set it as the selected item in the list picker. And if not, we set “WordNet ® 2.0” as the default dictionary. Before returning, we save the selected dictionary in the “SavedDictionary” key of the isolated storage settings. Fetching word definitions Getting this part done is very similar to the above code. We get the input word from the textbox, call into DefineInDictAsync() to fetch the definition and when DefineInDictAsync completes, we get the result and display it in the textblock. Here is the handler for the button click, private void OnButtonGoClick(object sender, RoutedEventArgs e) {     txtBlockWordMeaning.Text = "Please wait..";     IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;     if (txtboxInputWord.Text.Trim().Length <= 0)     {         MessageBox.Show("Please enter a word in the textbox and press 'Go'");     }     else     {         Dictionary[] listOfDictionaries = settings["SavedDictionaryList"] as Dictionary[];         string selectedDictionary = listPickerDictionaryList.SelectedItem.ToString();         string dictId = "wn"; //default dictionary is wordnet (wn is the dict id)         foreach (Dictionary dict in listOfDictionaries)         {             if (dict.Name == selectedDictionary)             {                 dictId = dict.Id;                 break;             }         }         DictServiceSoapClient client = GetDictServiceSoapClient();         client.DefineInDictCompleted += new EventHandler<DefineInDictCompletedEventArgs>(OnDefineInDictCompleted);         client.DefineInDictAsync(dictId, txtboxInputWord.Text.Trim());     } } We validate the input and then select the dictionary id based on the currently selected dictionary. We need the dictionary id because the API DefineInDict() expects the dictionary identifier and not the dictionary name. We could very well have stored the dictionary id in isolated storage settings too. Again, same as before, we register an event handler for the DefineInDictCompleted event and call the DefineInDictAsync() method passing in the dictionary id and the input word. 
void OnDefineInDictCompleted(object sender, DefineInDictCompletedEventArgs e) {     WordDefinition wd = e.Result;     scrollViewer.ScrollToVerticalOffset(0.0f);     if (wd.Definitions.Length == 0)     {         txtBlockWordMeaning.Text = String.Format("No definitions were found for '{0}' in '{1}'", txtboxInputWord.Text.Trim(), listPickerDictionaryList.SelectedItem.ToString().Trim());     }     else     {         txtBlockWordMeaning.Text = "";         foreach (Definition def in wd.Definitions)         {             string str = def.WordDefinition;             str = str.Replace("  ", " "); //some formatting             txtBlockWordMeaning.Text += str; //append so later definitions do not overwrite earlier ones         }     } } When the API completes, e.Result will contain a WordDefinition object. This class is also generated in the background while adding the service reference. We check the word definitions within this class to see if any results were returned; if not, we display a message to the user in the textblock. If a definition was found the text on the textblock is set to display the definition of the word. Adding final touches, we now need to save the current dictionary when the application exits. A small but useful thing is selecting the entire word in the input textbox when the user taps on it. This makes sure that if the user has looked up a definition for a really long word, he doesn’t have to press ‘clear’ too many times to enter the next word, protected override void OnNavigatingFrom(System.Windows.Navigation.NavigatingCancelEventArgs e) {     IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;     settings["SavedDictionary"] = listPickerDictionaryList.SelectedItem as string;     settings.Save();     base.OnNavigatingFrom(e); } private void OnTextboxInputWordGotFocus(object sender, RoutedEventArgs e) {     TextBox txtbox = sender as TextBox;     if (txtbox.Text.Trim().Length > 0)     {         txtbox.SelectionStart = 0;         txtbox.SelectionLength = txtbox.Text.Length;     } } OnNavigatingFrom() is called whenever you navigate away from the MainPage; since our application contains only one page, that would mean it is exiting. I leave you with a short video of the application in action, but before that if you have any suggestions on how to make the code better and improve it please do leave a comment. Until next time…

    Read the article

  • What does a Software Developer actually do?

    - by chobo2
Hi, I am graduating from my Computer Science degree a few weeks from now!! I started to look for my first job. For the last couple of years I have gotten really into web programming (ASP.NET). My first choice would be to get a junior ASP.NET MVC developer position, but I don't think any companies in my area use MVC yet, or if they do they are not hiring. So my second choice would be a junior ASP.NET WebForms developer. My other choices after that would be forms applications and mobile applications using .NET and C#. As you can see I am looking for something with .NET. I spent the last couple of years doing .NET projects for school and in my free time, and I love the language; it would pain me right now to switch to something like PHP. So now I found a posting in my area for an Entry Software Developer. I like the fact that they are using .NET and that it is an entry job (I never worked in this industry and never had more than a tutoring job, so I don't want to go for intermediate jobs). Posting Are you looking for an exciting challenge within a dynamic, people-oriented culture where you can launch your technical career? Company Name Inc. is a technology consulting company, located in Canada, that designs, develops, and delivers real-time interactive applications accessed via the Internet as well as back-end tools to support these applications. Company Name provides a combination of out-of-the-box and customized solutions to an expanding list of partners and customers. POSITION SUMMARY As a member of our team, the successful candidate will be responsible for helping us increase the quality and stability of our software systems by working jointly and directly with both the Software Development teams and the QA Team. The primary mission of this role will be to substantially enhance our test automation suite. The incumbent will design and program automated tests (unit, integration, system, stress and load) in Visual Studio using C# and will develop sound processes that help us identify and resolve defects as early as possible. The successful incumbent will help us improve and enhance system functionality, reliability, performance and scalability. This role is specifically designed for an eager, bright, new graduate who is looking for a stepping stone into a software engineering role. We promote from within and invite new graduates to apply for this important position - which may lead to new opportunities. We also offer a generous professional development plan to help you on your way. You will be a key part of a team of experts that is responsible for improving the quality of our software by: • Designing, writing, and executing test plans and programmatic tests in Visual Studio using C# and NUnit for functional testing of our code, new features, regression, and performance test procedures. • Working with the engineers to design and build the stress and load testing framework which emulates tens and even hundreds of thousands of concurrent users via a distributed network interfacing with our Load Testing Lab. • Interfacing with both the Development Team and the QA Team to ensure risks are identified and managed. • Mentoring and leading the QA Team in programmatic test automation technologies and tools. MUST HAVE SKILLS / QUALIFICATIONS: • Diploma or higher Degree in Computer Science, or equivalent formal training. • Fundamental C# programming skills. • Knowledge of Internet technologies and Microsoft Windows platforms. • Knowledge of PC hardware. • Excellent communication skills (both oral and written). 
• Self-starter who takes initiative, requires minimal supervision, can handle multiple simultaneous tasks. • Detail-oriented, able to concentrate, and work quickly. • Proven diagnostic, analytical, and problem solving skills. NICE TO HAVE SKILLS: • Exposure to Visual Studio Team System or Visual Studio Test Edition. • Exposure in C# using NUnit. • Exposure to NUnit, HTTPUnit, and other automation tool suites. • Exposure to Performance/Stress/Load Testing. • Good understanding of relational databases (MS SQL Server). • Familiar with video and online multi-player games. As part of our team you will have the opportunity to work with a supportive team of experts, drive your own success, and ride the wave as we continually expand our team of experts. If you are interested in this opportunity, please send your resume to [email protected] with “Entry Level Software Developer” in the subject line. So that is the posting. To me it sounds like it is a QA job. I don't have anything against QA jobs, but a lot of them seem to be ones where you're just clicking buttons and running scripts. Is this what a typical software developer does? I am so on the fence about applying for this job. On one side, I am not sure how much programming I would be doing. I want to be programming at least half the time, otherwise my skills will never improve since I will never be programming in teams and stuff. At the same time I have no experience in the industry, so on the other side I am thinking just go for it and then maybe a year later try to get a full programming job (provided that I get the job). Yet if I am not programming in that job then that experience will not help me for the next job I find, as I will be back at square one.
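For context on what the “programmatic tests in Visual Studio using C# and NUnit” from the posting typically look like, here is a minimal, hypothetical example; the Calculator class is invented purely for illustration and has nothing to do with the actual employer's code base:

using NUnit.Framework;

[TestFixture]
public class CalculatorTests
{
    [Test]
    public void Add_TwoPositiveNumbers_ReturnsSum()
    {
        Calculator calc = new Calculator(); //hypothetical class under test
        Assert.AreEqual(5, calc.Add(2, 3));
    }

    [Test]
    public void Add_NumberAndZero_ReturnsNumber()
    {
        Calculator calc = new Calculator();
        Assert.AreEqual(7, calc.Add(7, 0));
    }
}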

    Read the article

  • VS 2012 Code Review – Before Check In OR After Check In?

    - by Tarun Arora
“Is Code Review Important and Effective?” There is a consensus across the industry that code review is an effective and practical way to collar code inconsistency and possible defects early in the software development life cycle. Among others, some of the advantages of code reviews are:
Bugs are found faster
Forces developers to write readable code (code that can be read without explanation or introduction!)
Optimization methods/tricks/productive programs spread faster
Programmers as specialists "evolve" faster
It's fun
“Code review is systematic examination (often known as peer review) of computer source code. It is intended to find and fix mistakes overlooked in the initial development phase, improving both the overall quality of software and the developers' skills. Reviews are done in various forms such as pair programming, informal walkthroughs, and formal inspections.” Wikipedia Nowhere does the definition mention whether it's better to review code before the code has been committed to version control or after the commit has been performed. No matter which side you favour, Visual Studio 2012 allows you to request a code review both before check in and after check in. Let’s weigh the pros and cons of the approaches independently. Code Review Before Check In or Code Review After Check In? Approach 1 – Code review before check in: the developer completes the code and feels the code quality is appropriate for check in to TFS. The developer raises a code review request to have a second pair of eyes validate whether the code abides by the recommended best practices, will not result in any defects due to common coding mistakes and whether any optimizations can be made to improve the code quality. Image 1 – code review before check in
Pros
Everything that gets committed to source control is reviewed.
Minimizes the chances of smelly code making its way into the code base.
Decreases the cost of fixing bugs; remember, the earlier you find them, the less painful they are to fix.
Cons
Development code freeze – since the changes aren’t in source control yet, further development can only be done offline.
The changes have not been through a CI build; it's hard to say whether the code abides by all build quality standards.
Inconsistent! Cumbersome to track the actual code review process.
Not every change to the code base is worth reviewing; a lot of effort is invested for very little gain.
Approach 2 – Code review after check in: the developer checks in, and random code reviews are performed on the checked in code. Image 2 – Code review after check in
Pros
The code has already passed the CI build and run through any code analysis plug-ins you may have running on the build server.
Instruct the developer to ensure ZERO FxCop, StyleCop and static code analysis issues before check in. Code is cleaner and smell free even before the code review.
No offline development; developers can continue to develop against the source control.
Cons
Bad code can easily make its way into the code base.
Since the review takes place much later in the cycle, the cost of fixing issues can prove to be much higher.
Approach 3 – Hybrid Approach The community advocates a more hybrid approach, a blend of tooling and the human accountability quotient. Image 3 – Hybrid Approach 1. Code review high impact check ins. 
It is not possible to review everything; by setting up code review check-in policies for everything you can end up slowing your team down. Moreover, the code that you are reviewing before check in hasn't even been through a green CI build either. 2. Tooling. Let the tooling work for you. By running static analysis, FxCop, StyleCop and other plug-ins on the build agent, you can identify the real issues that in my opinion can't possibly be identified using human reviews. Configure the tooling to report back the top 10 issues every day. Mandate a manual code review for individuals who keep making it onto this list of shame. 3. During Merge. I would prefer eliminating some of the other code issues during the merge from the Main branch to the release branch. In a scrum project this is still easier because cherry picking the merges is a possibility and the size of code being reviewed is still limited. Let the tooling work for you: if someone breaks the CI build often, put them on a gated check-in build course until you see improvement. If someone appears on the top 10 list of shame generated via the build then ensure that all their code is reviewed till you see improvement. At the end of the day, the goal is to ensure that the code being delivered is top quality. By enforcing a code review before any check in, you force the developer to work offline or stay put till the review is complete. What do the experts say? So I asked a few experts what they thought of “Code review quality gate before checking in code?” Terje Sandstrom | Microsoft ALM MVP You mean a review quality gate BEFORE checking in code????? That would mean a lot of code staying either local or in shelvesets, and not even been through a CI build, and a green CI build being the main criteria for going further, f.e. to the review state. I would not like code laying around with no checkin’s. Having a requirement that code is checked in small pieces, 4-8 hours work max, and AT LEAST daily checkins, a manual code review comes second down the lane. I would expect review quality gates to happen before merging back to main, or before merging to release.  But that would all be on checked-in code.  Branching is absolutely one way to ease the pain.   Another way we are using is automatic quality builds, running metrics, coverage, static code analysis.  Unfortunately it takes some time, would be great to be on CI’s – but…., so it’s done scheduled every night. Based on this we get, among other stuff,  top 10 lists of suspicious code, which is then subjected to reviews.  If a person seems to be very popular on these top 10 lists, we subject every check in from that person to a review for a period. That normally helps.   None of the clients I have can afford to have every checkin reviewed, so we need to find ways around it. I don’t disagree with the nicety of having all the code reviewed, but I find it hard to find those resources in today’s enterprises. David V. Corbin | Visual Studio ALM Ranger I tend to agree with both sides. I hate having code that is not checked in, but at the same time hate having “bad” code in the repository. I have found that branching is one approach to solving this dilemma. Code is checked into the private/feature branch before the review, but is not merged over to the “official” branch until after the review. I advocate both, depending on circumstance (especially team dynamics)   - The “pre-checkin” is usually for elements that may impact the project as a whole. Think of it as another “gate” along with passing unit tests. 
- The “post-checkin” may very well not be at the changeset level, but correlates to a review at the “user story” level.   Again, this depends on team dynamics in play…. Robert MacLean | Microsoft ALM MVP I do not think there is one right answer for the industry as a whole. In short the question is why do you do reviews? Your question implies risk mitigation, so in low risk areas you can get away with it after check in while in high risk you need to do it before check in. An example is that those new to a team or juniors need it much earlier (maybe that is before checkin, maybe that is soon after) than seniors who have shipped twenty sprints on the team. Abhimanyu Singhal | Visual Studio ALM Ranger Depends on a per-scenario basis. We recommend post check-in reviews when: 1. We don't want to block other checks and processes on manual code reviews. Manual reviews take time, and some pieces may not require manual reviews at all. 2. We need to trace all changes and track history. 3. We have a code promotion strategy/process in place. For risk mitigation, post checkin code can be promoted to Accepted branches. Or can be rejected. Pre Checkin Reviews are used when 1. There is a high risk factor associated 2. Reviewers generally (most of the time) have immediate availability. 3. Team does not have strict tracking needs. Simply speaking, no single process fits all scenarios. You need to select what works best for your team/project. Thomas Schissler | Visual Studio ALM Ranger This is an interesting discussion, I’m right now discussing details about executing code reviews with my teams. I see and understand the aspects you brought in, but there is another side as well, I’d like to point out. 1.) If you do reviews per check in this is not very practical as a hard rule because this will disturb the flow of the team very often or it will lead to reducing the checkin frequency of the devs which I would not accept. 2.) If you do later reviews, for example if you review PBIs, it is not easy to find out which code you should review. Either you review all changesets associated with the PBI, but then you might review code which has been changed with a later checkin and the dev maybe has already fixed the issue. Or you review the diff of the latest changeset of the PBI with the first but then you might also review changes of other PBIs. Jakob Leander | Sr. Director, Avanade In my experience, manual code review: 1. Does not get done and at the very least does not get redone after changes (regardless of intentions at start of project) 2. When a project actually does it, they often do not do it right away = errors pile up 3. Requires a lot of time discussing/defining the standard and for the team to learn it. However code review is very important since e.g. even small memory leaks in a high volume web solution have big consequences. In the last years I have advocated the following approach for code review - Architects up front do “at least one best practice example” of each type of component and tell the team. Copy from this one. This should include error handling, logging, security etc. - Dev lead on project continuously browses code to validate that the best practices are used. Especially that patterns etc. are not broken. You can do this formally after each sprint/iteration if you want. Once this is validated it is unlikely to “go bad” even during later code changes. - Agree with the customer to rely on static code analysis from Visual Studio as the one and only coding standard. 
This has HUUGE benefits
- You can easily tweak to reach the level you desire together with the customer
- It is easy to measure for both developers/management
- It is 100% consistent across the code base
- It gets validated all the time so you never end up getting hammered by a customer review in the end
- It is easy to tell the developer that you do not want code back unless it has zero errors = minimize communication
You need to track this at least during nightly builds and make sure the team sees the total # of issues. Do not allow the # of issues to grow uncontrolled. On the project I run I require code analysis to have run on code before checkin (checkin rule). This means
- You have to have a clean compile (or CA won't run) so this is an extra benefit = very few broken builds
- You can change a few of the rules to compile as errors instead of warnings. I often do this for “missing dispose” issues which you REALLY do not want in your app
Tip: Place your custom CA rules files as part of the solution. That way it works when you do branching etc. (path to CA file is relative in VS) Some may argue that CA is not as good as manual inspection. But since manual inspection in reality suffers from the 3 issues listed at the start, it is IMO a MUCH better (and much cheaper) approach from a helicopter perspective. Tirthankar Dutta | Director, Avanade I think code review should be run both before and after check ins. There are some code metrics that are meant to be run on the entire codebase … Also, especially on multi-site projects, one should strive to architect in a way that lets men manage the framework while boys write the repetitive code… scales very well with the need to review less by containment and imposing architectural restrictions to emphasise the design. Bruno Capuano | Microsoft ALM MVP For code reviews (meaning peer reviews) in a distributed team I use http://www.vsanywhere.com/default.aspx  David Jobling | Global Sr. Director, Avanade Peer review is the only way to scale and it's a great practice for all in the team to learn to perform and accept. In my experience you soon learn whose code to watch more than others and tune the attention. Mikkel Toudal Kristiansen | Manager, Avanade If you have several branches in your code base, you will need to merge often. This requires manual merging when a file has been changed in both branches. It offers a good opportunity to actually review the changed code. So my advice is: merging between branches should be done as often as possible, it should be done by a senior developer, and he/she should perform a full code review of the code being merged. As for detecting architectural smells and code smells creeping into the code base, one really good third party tool exists: NDepend (http://www.ndepend.com/, for static code analysis of the current state of the code base). You could also consider adding StyleCop to the solution. Jesse Houwing | Visual Studio ALM Ranger I gave a presentation on this subject at the TechDays conference in NL last year. See my presentation and slides here (talk in Dutch, but English presentation): http://blog.jessehouwing.nl/2012/03/did-you-miss-my-techdaysnl-talk-on-code.html  I’d like to add a few more points: - Before/after checkin is mostly a trust issue. If you have a team that does diligent peer reviews and regularly talks/sits together or pair programs, there’s no need to enforce a before-checkin policy. The pair programming and regular feedback during development can take care of most of the review requirements as long as the team isn’t under stress. 
- Under stress, enforce pre-checkin reviews, it might sound strange, if you’re already under time or budgetary constraints, but it is under such conditions most real issues start to be created or pile up. - Use tools to catch most common errors, Code Analysis/FxCop was already mentioned. HP Fortify, Resharper, Coderush etc can help you there. There are also a lot of 3rd party rules you can add to Code Analysis. I’ve written a few myself (http://fccopcontrib.codeplex.com) and various teams from Microsoft have added their own rules (MSOCAF for SharePoint, WSSF for WCF). For common errors that keep cropping up, see if you can define a rule. It’s much easier. But more importantly make sure you have a good help page explaining *WHY* it's wrong. If you have small feature or developer branches/shelvesets, you might want to review pre-merge. It’s still better to do peer reviews and peer programming, but the most important thing is that bad quality code doesn’t make it into the important branch. So my philosophy: - Use tooling as much as possible. - Make sure the team understands the tooling and the importance of the things it flags. It’s too easy to just click suppress all to ignore the warnings. - Under stress, tighten process, it’s under stress that the problems of late reviews will really surface - Most importantly if you do reviews do them as early as possible, but never later than needed. In other words, pre-checkin/post checking doesn’t really matter, as long as the review is done before the code is released. It’ll just be much more expensive to fix any review outcomes the later you find them. --- I would love to hear what you think!
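
To make Jakob's “missing dispose” point concrete, here is a minimal, hypothetical C# sketch. The ReportReader class is invented for illustration; CA2000 (“Dispose objects before losing scope”) is the Code Analysis rule that flags the leaky variant, and it is the kind of warning he suggests promoting to a build error:

using System.IO;

public class ReportReader
{
    // Flagged by Code Analysis rule CA2000: the StreamReader (and the
    // file handle it owns) is never disposed by this method.
    public string ReadLeaky(string path)
    {
        var reader = new StreamReader(path);
        return reader.ReadToEnd();
    }

    // The fix: a using block guarantees Dispose() runs even if
    // ReadToEnd() throws. With CA2000 promoted from warning to error,
    // the leaky version above simply fails the build.
    public string ReadSafe(string path)
    {
        using (var reader = new StreamReader(path))
        {
            return reader.ReadToEnd();
        }
    }
}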

    Read the article

  • IOC Container Handling State Params in Non-Default Constructor

    - by Mystagogue
For the purpose of this discussion, there are two kinds of parameters an object constructor might take: state dependencies or service dependencies. Supplying a service dependency with an IOC container is easy: DI takes over. But in contrast, state dependencies are usually only known to the client, that is, the object requestor. It turns out that having a client supply the state params through an IOC container is quite painful. I will show several different ways to do this, all of which have big problems, and ask the community if there is another option I'm missing. Let's begin: before I added an IOC container to my project code, I started with a class like this:

class Foobar {
    //parameters are state dependencies, not service dependencies
    public Foobar(string alpha, int omega){...};
    //...other stuff
}

I decide to add a Logger service dependency to the Foobar class, which perhaps I'll provide through DI:

class Foobar {
    public Foobar(string alpha, int omega, ILogger log){...};
    //...other stuff
}

But then I'm also told I need to make class Foobar itself "swappable." That is, I'm required to service-locate a Foobar instance. I add a new interface into the mix:

class Foobar : IFoobar {
    public Foobar(string alpha, int omega, ILogger log){...};
    //...other stuff
}

When I make the service locator call, it will DI the ILogger service dependency for me. Unfortunately the same is not true of the state dependencies Alpha and Omega. Some containers offer a syntax to address this:

//Unity 2.0 pseudo-ish code:
myContainer.Resolve<IFoobar>( new parameterOverride[] { {"alpha", "one"}, {"omega", 2} } );

I like the feature, but I don't like that it is untyped and not evident to the developer what parameters must be passed (via intellisense, etc). So I look at another solution:

//This is a "boiler plate" heavy approach!
class Foobar : IFoobar {
    public Foobar (string alpha, int omega){...};
    //...stuff
}

class FoobarFactory : IFoobarFactory {
    IFoobar IFoobarFactory.Create(string alpha, int omega){
        return new Foobar(alpha, omega);
    }
}

//fetch it...
myContainer.Resolve<IFoobarFactory>().Create("one", 2);

The above solves the type-safety and intellisense problem, but it (1) forced class Foobar to fetch an ILogger through a service locator rather than DI and (2) it requires me to make a bunch of boiler-plate (XXXFactory, IXXXFactory) for all varieties of Foobar implementations I might use. Should I decide to go with a pure service locator approach, it may not be a problem. But I still can't stand all the boiler-plate needed to make this work. So then I try this:

//code named "concrete creator"
class Foobar : IFoobar {
    public Foobar(string alpha, int omega, ILogger log){...};

    static IFoobar Create(string alpha, int omega){
        //unity 2.0 pseudo-ish code. Assume a common
        //service locator, or singleton holds the container...
        return Container.Resolve<IFoobar>( new parameterOverride[] {{"alpha", alpha},{"omega", omega}} );
    }
}

//Get my instance:
Foobar.Create("alpha", 2);

I actually don't mind that I'm using the concrete "Foobar" class to create an IFoobar. It represents a base concept that I don't expect to change in my code. I also don't mind the lack of type-safety in the static "Create", because it is now encapsulated. My intellisense is working too! Any concrete instance made this way will ignore the supplied state params if they don't apply (a Unity 2.0 behavior). Perhaps a different concrete implementation "FooFoobar" might have a formal arg name mismatch, but I'm still pretty happy with it.
But the big problem with this approach is that it only works effectively with Unity 2.0 (a mismatched parameter in StructureMap will throw an exception). So it is good only if I stay with Unity. The problem is, I'm beginning to like StructureMap a lot more. So now I go on to yet another option:

class Foobar : IFoobar, IFoobarInit {
    public Foobar(ILogger log){...};

    IFoobar IFoobarInit.Initialize(string alpha, int omega){
        this.alpha = alpha;
        this.omega = omega;
        return this;
    }
}

//now create it...
IFoobar foo = myContainer.Resolve<IFoobarInit>().Initialize("one", 2);

Now with this I've got a somewhat nice compromise with the other approaches: (1) my arguments are type-safe / intellisense-aware, (2) I have a choice of fetching the ILogger via DI (shown above) or a service locator, (3) there is no need to make one or more separate concrete FoobarFactory classes (contrast with the verbose "boiler-plate" example code earlier), and (4) it reasonably upholds the principle "make interfaces easy to use correctly, and hard to use incorrectly." At least it arguably is no worse than the alternatives previously discussed. One acceptance barrier yet remains: I also want to apply "design by contract." Every sample I presented was intentionally favoring constructor injection (for state dependencies) because I want to preserve "invariant" support as most commonly practiced. Namely, the invariant is established when the constructor completes. In the sample above, the invariant is not established when object construction completes. As long as I'm doing home-grown "design by contract" I could just tell developers not to test the invariant until the Initialize(...) method is called. But more to the point, when .net 4.0 comes out I want to use its "code contract" support for design by contract. From what I read, it will not be compatible with this last approach. Curses! Of course it also occurs to me that my entire philosophy is off. Perhaps I'd be told that conjuring a Foobar : IFoobar via a service locator implies that it is a service - and services only have other service dependencies; they don't have state dependencies (such as the Alpha and Omega of these examples). I'm open to listening to such philosophical matters as well, but I'd also like to know what semi-authoritative reference to read that would steer me down that thought path. So now I turn it to the community. What approach should I consider that I haven't yet? Must I really believe I've exhausted my options?
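
One further option, not in the question above: a minimal, container-agnostic sketch of a typed factory delegate. It reuses the names from the examples (Foobar, IFoobar, ILogger); the CompositionRoot class and its wiring are hypothetical, invented for illustration.

using System;

public interface ILogger { void Log(string message); }
public interface IFoobar { }

public class Foobar : IFoobar
{
    private readonly string alpha;  // state dependency
    private readonly int omega;     // state dependency
    private readonly ILogger log;   // service dependency

    public Foobar(string alpha, int omega, ILogger log)
    {
        this.alpha = alpha;
        this.omega = omega;
        this.log = log;
    }
}

// Hypothetical composition root: the only place that knows both the
// resolved service dependencies and the concrete Foobar type.
public static class CompositionRoot
{
    // The typed factory the client uses instead of resolving IFoobar itself.
    public static Func<string, int, IFoobar> FoobarFactory { get; private set; }

    public static void Configure(ILogger logger)
    {
        // The delegate closes over the service dependency; callers supply
        // only the state dependencies, with compile-time typing and intellisense.
        FoobarFactory = (alpha, omega) => new Foobar(alpha, omega, logger);
    }
}

// Client code:
//   var foo = CompositionRoot.FoobarFactory("one", 2);

The trade-off: the client never touches the container for Foobar instances, which sidesteps the untyped parameterOverride syntax entirely, at the cost of one line of wiring per implementation in the composition root.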

    Read the article

  • More Animation - Self Dismissing Dialogs

    - by Duncan Mills
In my earlier articles on animation, I discussed various slide, grow and flip transitions for items and containers. In this article I want to discuss a fade animation, and specifically the use of fades and auto-dismissal for informational dialogs. If you use a Mac, you may be familiar with Growl as a notification system, and the nice way that messages that are informational just fade out after a few seconds. So in this blog entry I wanted to discuss how we could make an ADF popup behave in the same way. This can be an effective way of communicating information to the user without "getting in the way" with modal alerts. This of course has been done before, but everything I've seen previously requires something like jQuery to be in the mix when we don't really need it to be. The solution I've put together is nice and generic and will work with either <af:panelWindow> or <af:dialog> as the child of the popup. In terms of usage it's pretty simple: we just need to ensure that the popup itself has clientComponent set to true and includes the animation JavaScript (animateFadingPopup) on a popupOpened event:

<af:popup id="pop1" clientComponent="true">
  <af:panelWindow title="A Fading Message...">
   ...
  </af:panelWindow>
  <af:clientListener method="animateFadingPopup" type="popupOpened"/>
</af:popup>

The popup can be invoked in the normal way using showPopupBehavior or JavaScript; no special code is required there. As a further twist you can include an additional clientAttribute called preFadeDelay to define a delay before the fade itself starts (the default is 5 seconds). To set the delay to just 2 seconds, for example:

<af:popup ...>
  ...
  <af:clientAttribute name="preFadeDelay" value="2"/>
  <af:clientListener method="animateFadingPopup" type="popupOpened"/>
</af:popup>

The Animation Styles
As before, we have a couple of CSS styles which define the animation. I've put these into the skin in my case and, as in the other articles, I've only defined the transitions for WebKit browsers (Chrome, Safari) at the moment. In this case, the fade is timed at 5 seconds in duration.

.popupFadeReset {
  opacity: 1;
}
.popupFadeAnimate {
  opacity: 0;
  -webkit-transition: opacity 5s ease-in-out;
}

As you can see here, we are achieving the fade by simply setting the CSS opacity property.

The JavaScript
The final part of the puzzle is, of course, the JavaScript. There are four functions, and they are generic (apart from the style names which, if you've changed them above, you'll need to reflect here):
- The initial function invoked from the popupOpened event, animateFadingPopup, which starts a timer and provides the initial delay before we start to fade the popup.
- The function that applies the fade animation to the popup - initiatePopupFade.
- The callback function - closeFadedPopup - used to reset the style class and correctly hide the popup so that it can be invoked again and again.
- A utility function - findFadeContainer - which is responsible for locating the correct child component of the popup to actually apply the style to.

Function - animateFadingPopup
This function, as stated, is the one hooked up to the popupOpened event via a clientListener. Because of when the code is called, it does not actually matter how you launch the popup, or if the popup is re-used from multiple places. All usages will get the fade behavior.
/**
 * Client listener which will kick off the animation to fade the dialog and register
 * a callback to correctly reset the popup once the animation is complete
 * @param event
 */
function animateFadingPopup(event) {
  var fadePopup = event.getSource();
  var fadeCandidate = false;
  //Ensure that the popup is initially Opaque
  //This handles the situation where the user has dismissed
  //the popup whilst it was in the process of fading
  var fadeContainer = findFadeContainer(fadePopup);
  if (fadeContainer != null) {
    fadeCandidate = true;
    fadeContainer.setStyleClass("popupFadeReset");
  }
  //Only continue if we can actually fade this popup
  if (fadeCandidate) {
    //See if a delay has been specified
    var waitTimeSeconds = event.getSource().getProperty('preFadeDelay');
    //Default to 5 seconds if not supplied
    if (waitTimeSeconds == undefined) {
      waitTimeSeconds = 5;
    }
    // Now call the fade after the specified time
    var fadeFunction = function () {
      initiatePopupFade(fadePopup);
    };
    var fadeDelayTimer = setTimeout(fadeFunction, (waitTimeSeconds * 1000));
  }
}

The thing to note about this function is the initial check and style reset we have to do to ensure that the container is visible. This is to handle the situation where the popup has begun the fade, yet the user has explicitly dismissed the popup before it is complete and in doing so has prevented the callback function (described later) from executing. In this particular situation the initial display of the dialog will be (apparently) missing its normal animation, but at least it becomes visible to the user (and most users will probably not notice this difference in any case). You'll notice that the style that we apply to reset the opacity - popupFadeReset - is not applied to the popup component itself but rather to the dialog or panelWindow within it. More about that in the description of the next function, findFadeContainer(). Finally, assuming that we have a suitable candidate for fading, a JavaScript timer is started using the specified preFadeDelay wait time (or 5 seconds if that was not supplied). When this timer expires, the main animation style class will be applied using the initiatePopupFade() function.

Function - findFadeContainer
As a component, the <af:popup> does not support the styleClass attribute, so we can't apply the animation style directly. Instead we have to look for the container within the popup which defines the window object that can have a style attached. This is achieved by the following code:

/**
 * The thing we actually fade will be the only child
 * of the popup assuming that this is a dialog or window
 * @param popup
 * @return the component, or null if this is not valid for fading
 */
function findFadeContainer(popup) {
  var children = popup.getDescendantComponents();
  var fadeContainer = children[0];
  if (fadeContainer != undefined) {
    var compType = fadeContainer.getComponentType();
    if (compType == "oracle.adf.RichPanelWindow" || compType == "oracle.adf.RichDialog") {
      return fadeContainer;
    }
  }
  return null;
}

So what we do here is to grab the first child component of the popup and check its type. Here I decided to limit the fade behaviour to only <af:dialog> and <af:panelWindow>. This was deliberate. If we apply the fade to, say, an <af:noteWindow>, you would see the text inside the balloon fade, but the balloon itself would hang around until the fade animation was over and then hide.
It would of course be possible to make the code smarter, walking up the DOM tree to find the correct <div> to apply the style to in order to hide the whole balloon; however, that means this JavaScript would then need to have knowledge of the generated DOM structure, something which may change from release to release, and certainly something to avoid. So, all in all, I think that this is an OK restriction, and frankly it's windows and dialogs that I wanted to fade anyway, not balloons and menus. You could of course extend this technique and handle the other types should you really want to. One thing to note here is the selection of the first (children[0]) child of the popup. It does not matter if there are non-visible children such as a clientListener before the <af:dialog> or <af:panelWindow> within the popup; they are not included in this array, so picking the first element in this way seems to be fine, no matter what the underlying ordering is within the JSF source. If you wanted a super-robust version of the code you might want to iterate through the children array of the popup to check for the right type; again, it's up to you.

Function - initiatePopupFade
On to the actual fading. This is actually very simple and, at its heart, just the application of the popupFadeAnimate style to the correct component and then registering a callback to execute once the fade is done.

/**
 * Function which will kick off the animation to fade the dialog and register
 * a callback to correctly reset the popup once the animation is complete
 * @param popup the popup we are animating
 */
function initiatePopupFade(popup) {
  //Only continue if the popup has not already been dismissed
  if (popup.isPopupVisible()) {
    //The skin styles that define the animation
    var fadeoutAnimationStyle = "popupFadeAnimate";
    var fadeAnimationResetStyle = "popupFadeReset";
    var fadeContainer = findFadeContainer(popup);
    if (fadeContainer != null) {
      var fadeContainerReal = AdfAgent.AGENT.getElementById(fadeContainer.getClientId());
      //Define the callback; this will correctly reset the popup once it's disappeared
      var fadeCallbackFunction = function (event) {
        closeFadedPopup(popup, fadeContainer, fadeAnimationResetStyle);
        event.target.removeEventListener("webkitTransitionEnd", fadeCallbackFunction);
      };
      //Initiate the fade
      fadeContainer.setStyleClass(fadeoutAnimationStyle);
      //Register the callback to execute once fade is done
      fadeContainerReal.addEventListener("webkitTransitionEnd", fadeCallbackFunction, false);
    }
  }
}

I've added some extra checks here though. First of all we only start the whole process if the popup is still visible. It may be that the user has closed the popup before the delay timer has finished, so there is no need to start animating in that case. Again we use the findFadeContainer() function to locate the correct component to apply the style to, and additionally we grab the DOM id that represents that container. This physical ID is required for the registration of the callback function. The closeFadedPopup() call is then registered on the callback so as to correctly close the now transparent (but still there) popup.
Function - closeFadedPopup
The final function just cleans things up:

/**
 * Callback function to correctly cancel and reset the style in the popup
 * @param popup id of the popup so we can close it properly
 * @param container the window / dialog within the popup to actually style
 * @param resetStyle the style that sets the opacity back to solid
 */
function closeFadedPopup(popup, container, resetStyle) {
  container.setStyleClass(resetStyle);
  popup.cancel();
}

First of all we reset the style to make the popup contents opaque again, and then we cancel the popup. This will ensure that any of your user code that is waiting for a popup-cancelled event will actually get the event; additionally, if you have done this as a modal window/dialog, it will ensure that the glasspane is dismissed and you can interact with the UI again.

What's Next?
There are several ways in which this technique could be used. I've been working on a popup here, but you could apply the same approach to in-line messages. As this code (in the popup case) is generic, it will make a pretty nice declarative component, and maybe, if I get time, I'll look at constructing a formal Growl component using a combination of this technique and active data push. Also, I'm sure the above code can be improved a little too. Specifically, things like registering a popup-cancelled listener to handle the style reset, so that we don't lose the subtle animation that takes place when the popup is opened in the situation where the user has closed the dialog mid-fade.

    Read the article

  • Oracle Solaris: Zones on Shared Storage

    - by Jeff Victor
Oracle Solaris 11.1 has several new features. At oracle.com you can find a detailed list. One of the significant new features, and the most significant new feature related to Oracle Solaris Zones, is casually called "Zones on Shared Storage" or simply ZOSS (rhymes with "moss"). ZOSS offers much more flexibility because you can store Solaris Zones on shared storage (surprise!) so that you can perform quick and easy migration of a zone from one system to another. This blog entry describes and demonstrates the use of ZOSS. ZOSS provides complete support for a Solaris Zone that is stored on "shared storage." In this case, "shared storage" refers to Fibre Channel (FC) or iSCSI devices, although there is one lone exception that I will demonstrate soon. The primary intent is to enable you to store a zone on FC or iSCSI storage so that it can be migrated from one host computer to another much more easily and safely than in the past. With this blog entry, I wanted to make it easy for you to try this yourself. I couldn't assume that you have a SAN available - which is a good thing, because neither do I! What could I use, instead? [There he goes, foreshadowing again... -Ed.] Developing this entry reinforced the lesson that the solution to every lab problem is VirtualBox. Oracle VM VirtualBox (its formal name) helps here in a couple of important ways. It offers the ability to easily install multiple copies of Solaris as guests on top of any popular system (Microsoft Windows, MacOS, Solaris, Oracle Linux (and other Linuxes) etc.). It also offers the ability to create a separate virtual disk drive (VDI) that appears as a local hard disk to a guest. This virtual disk can be moved very easily from one guest to another. In other words, you can follow the steps below on a laptop or larger x86 system. Please note that the ability to use ZOSS to store a zone on a local disk is very useful for a lab environment, but not so useful for production. I do not suggest regularly moving disk drives among computers. In the method I describe below, that virtual hard disk will contain the zone that will be migrated among the (virtual) hosts. In production, you would use FC or iSCSI LUNs instead. The zonecfg(1M) man page details the syntax for each of the three types of devices.

Why Migrate?
Why is the migration of virtual servers important? Some of the most common reasons are:
- Moving a workload to a different computer so that the original computer can be turned off for extensive maintenance.
- Moving a workload to a larger system because the workload has outgrown its original system.
- If the workload runs in an environment (such as a Solaris Zone) that is stored on shared storage, you can restore the service of the workload on an alternate computer if the original computer has failed and will not reboot.
- You can simplify lifecycle management of a workload by developing it on a laptop, migrating it to a test platform when it's ready, and finally moving it to a production system.

Concepts
For ZOSS, the important new concept is named "rootzpool". You can read about it in the zonecfg(1M) man page, but here's the short version: it's the backing store (hard disk(s), or LUN(s)) that will be used to make a ZFS zpool - the zpool that will hold the zone. This zpool:
- contains the zone's Solaris content, i.e. the root file system
- does not contain any content not related to the zone
- can only be mounted by one Solaris instance at a time

Method Overview
Here is a brief list of the steps to create a zone on shared storage and migrate it. The next section shows the commands and output. You will need a host system with an x86 CPU (hopefully at least a couple of CPU cores), at least 2GB of RAM, and at least 25GB of free disk space. (The steps below will not actually use 25GB of disk space, but I don't want to lead you down a path that ends in a big sign that says "Your HDD is full. Good luck!")
- Configure the zone on both systems, specifying the rootzpool that both will use. The best way is to configure it on one system and then copy the output of "zonecfg export" to the other system to be used as input to zonecfg. This method reduces the chances of pilot error. (It is not necessary to configure the zone on both systems before creating it. You can configure this zone in multiple places, whenever you want, and migrate it to one of those places at any time - as long as those systems all have access to the shared storage.)
- Install the zone on one system, onto shared storage.
- Boot the zone.
- Provide system configuration information to the zone. (In the Real World(tm) you will usually automate this step.)
- Shutdown the zone.
- Detach the zone from the original system.
- Attach the zone to its new "home" system.
- Boot the zone.
The zone can be used normally, and even migrated back, or to a different system.

Details
The rest of this shows the commands and output. The two hostnames are "sysA" and "sysB". Note that each Solaris guest might use a different device name for the VDI that they share. I used the device names shown below, but you must discover the device name(s) after booting each guest. In a production environment you would also discover the device name first and then configure the zone with that name. Fortunately, you can use the command "zpool import" or "format" to discover the device on the "new" host for the zone. The first steps create the VirtualBox guests and the shared disk drive. I describe the steps here without demonstrating them.
- Download VirtualBox and install it using a method normal for your host OS. You can read the complete instructions.
- Create two VirtualBox guests, each to run Solaris 11.1. Each will use its own VDI as its root disk.
- Install Solaris 11.1 in each guest. To install a Solaris 11.1 guest, you can either download a pre-built VirtualBox guest and import it, or install Solaris 11.1 from the "text install" media. If you use the latter method, after booting you will not see a windowing system. To install the GUI and other important things, login and run "pkg install solaris-desktop" and take a break while it installs those important things.
- Life is usually easier if you install the VirtualBox Guest Additions because then you can copy and paste between the host and guests, etc. You can find the guest additions in the folder matching the version of VirtualBox you are using. You can also read the instructions for installing the guest additions.
- To create the zone's shared VDI in VirtualBox, you can open the storage configuration for one of the two guests, select the SATA controller, and click on the "Add Hard Disk" icon nearby. Choose "Create New Disk" and specify an appropriate path name for the file that will contain the VDI. The shared VDI must be at least 1.5 GB. Note that the guest must be stopped to do this.
- Add that VDI to the other guest - using its Storage configuration - so that each can access it while running. The steps start out the same, except that you choose "Choose Existing Disk" instead of "Create New Disk." Because the disk is configured on both of them, VirtualBox prevents you from running both guests at the same time.
- Identify the device names of that VDI in each of the guests. Solaris chooses the name based on existing devices. The names may be the same, or may be different from each other. This step is shown below as "Step 1."

Assumptions
In the example shown below, I make these assumptions:
- The guest that will own the zone at the beginning is named sysA.
- The guest that will own the zone after the first migration is named sysB.
- On sysA, the shared disk is named /dev/dsk/c7t2d0
- On sysB, the shared disk is named /dev/dsk/c7t3d0

(Finally!) The Steps

Step 1) Determine the name of the disk that will move back and forth between the systems.
root@sysA:~# format
Searching for disks...done
AVAILABLE DISK SELECTIONS:
  0. c7t0d0 /pci@0,0/pci8086,2829@d/disk@0,0
  1. c7t2d0 /pci@0,0/pci8086,2829@d/disk@2,0
Specify disk (enter its number): ^D

Step 2) The first thing to do is partition and label the disk. The magic needed to write an EFI label is not overly complicated.
root@sysA:~# format -e c7t2d0
selecting c7t2d0
[disk formatted]
FORMAT MENU: ...
format> fdisk
No fdisk table exists. The default partition for the disk is:
  a 100% "SOLARIS System" partition
Type "y" to accept the default partition, otherwise type "n" to edit the partition table. n
SELECT ONE OF THE FOLLOWING: ...
Enter Selection: 1
... G=EFI_SYS 0=Exit? f
SELECT ONE... ... 6
format> label
... Specify Label type[1]: 1
Ready to label disk, continue? y
format> quit
root@sysA:~# ls /dev/dsk/c7t2d0
/dev/dsk/c7t2d0

Step 3) Configure zone1 on sysA.
root@sysA:~# zonecfg -z zone1
Use 'create' to begin configuring a new zone.
zonecfg:zone1> create
create: Using system default template 'SYSdefault'
zonecfg:zone1> set zonename=zone1
zonecfg:zone1> set zonepath=/zones/zone1
zonecfg:zone1> add rootzpool
zonecfg:zone1:rootzpool> add storage dev:dsk/c7t2d0
zonecfg:zone1:rootzpool> end
zonecfg:zone1> exit
root@sysA:~# zonecfg -z zone1 info
zonename: zone1
zonepath: /zones/zone1
brand: solaris
autoboot: false
bootargs:
file-mac-profile:
pool:
limitpriv:
scheduling-class:
ip-type: exclusive
hostid:
fs-allowed:
anet: ...
rootzpool:
  storage: dev:dsk/c7t2d0

Step 4) Install the zone. This step takes the most time, but you can wander off for a snack or a few laps around the gym - or both! (Just not at the same time...)
root@sysA:~# zoneadm -z zone1 install
Created zone zpool: zone1_rpool
Progress being logged to /var/log/zones/zoneadm.20121022T163634Z.zone1.install
Image: Preparing at /zones/zone1/root.
AI Manifest: /tmp/manifest.xml.RXaycg
SC Profile: /usr/share/auto_install/sc_profiles/enable_sci.xml
Zonename: zone1
Installation: Starting ...
Creating IPS image
Startup linked: 1/1 done
Installing packages from: solaris origin: http://pkg.us.oracle.com/support/
DOWNLOAD PKGS FILES XFER (MB) SPEED
Completed 183/183 33556/33556 222.2/222.2 2.8M/s
PHASE ITEMS
Installing new actions 46825/46825
Updating package state database Done
Updating image state Done
Creating fast lookup database Done
Installation: Succeeded
Note: Man pages can be obtained by installing pkg:/system/manual
done.
Done: Installation completed in 1696.847 seconds.
Next Steps: Boot the zone, then log into the zone console (zlogin -C) to complete the configuration process.
Log saved in non-global zone as /zones/zone1/root/var/log/zones/zoneadm.20121022T163634Z.zone1.install Step 5) Boot the Zone. root@sysA:~# zoneadm -z zone1 boot Step 6) Login to zone's console to complete the specification of system information. root@sysA:~# zlogin -C zone1 Answer the usual questions and wait for a login prompt. Then you can end the console session with the usual "~." incantation. Step 7) Shutdown the zone so it can be "moved." root@sysA:~# zoneadm -z zone1 shutdown Step 8) Detach the zone so that the original global zone can't use it. root@sysA:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 installed /zones/zone1 solaris excl root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - zone1_rpool 1.98G 484M 1.51G 23% 1.00x ONLINE - root@sysA:~# zoneadm -z zone1 detach Exported zone zpool: zone1_rpool Step 9) Review the result and shutdown sysA so that sysB can use the shared disk. root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - root@sysA:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 configured /zones/zone1 solaris excl root@sysA:~# init 0 Step 10) Now boot sysB and configure a zone with the parameters shown above in Step 1. (Again, the safest method is to use "zonecfg ... export" on sysA as described in section "Method Overview" above.) The one difference is the name of the rootzpool storage device, which was shown in the list of assumptions, and which you must determine by booting sysB and using the "format" or "zpool import" command. When that is done, you should see the output shown next. (I used the same zonename - "zone1" - in this example, but you can choose any valid zonename you want.) root@sysB:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 configured /zones/zone1 solaris excl root@sysB:~# zonecfg -z zone1 info zonename: zone1 zonepath: /zones/zone1 brand: solaris autoboot: false bootargs: file-mac-profile: pool: limitpriv: scheduling-class: ip-type: exclusive hostid: fs-allowed: anet: linkname: net0 ... rootzpool: storage: dev:dsk/c7t3d0 Step 11) Attaching the zone automatically imports the zpool. root@sysB:~# zoneadm -z zone1 attach Imported zone zpool: zone1_rpool Progress being logged to /var/log/zones/zoneadm.20121022T184034Z.zone1.attach Installing: Using existing zone boot environment Zone BE root dataset: zone1_rpool/rpool/ROOT/solaris Cache: Using /var/pkg/publisher. Updating non-global zone: Linking to image /. Processing linked: 1/1 done Updating non-global zone: Auditing packages. No updates necessary for this image. Updating non-global zone: Zone updated. Result: Attach Succeeded. Log saved in non-global zone as /zones/zone1/root/var/log/zones/zoneadm.20121022T184034Z.zone1.attach root@sysB:~# zoneadm -z zone1 boot root@sysB:~# zlogin zone1 [Connected to zone 'zone1' pts/2] Oracle Corporation SunOS 5.11 11.1 September 2012 Step 12) Now let's migrate the zone back to sysA. Create a file in zone1 so we can verify it exists after we migrate the zone back, then begin migrating it back. 
root@zone1:~# ls /opt root@zone1:~# touch /opt/fileA root@zone1:~# ls -l /opt/fileA -rw-r--r-- 1 root root 0 Oct 22 14:47 /opt/fileA root@zone1:~# exit logout [Connection to zone 'zone1' pts/2 closed] root@sysB:~# zoneadm -z zone1 shutdown root@sysB:~# zoneadm -z zone1 detach Exported zone zpool: zone1_rpool root@sysB:~# init 0 Step 13) Back on sysA, check the status. Oracle Corporation SunOS 5.11 11.1 September 2012 root@sysA:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 configured /zones/zone1 solaris excl root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - Step 14) Re-attach the zone back to sysA. root@sysA:~# zoneadm -z zone1 attach Imported zone zpool: zone1_rpool Progress being logged to /var/log/zones/zoneadm.20121022T190441Z.zone1.attach Installing: Using existing zone boot environment Zone BE root dataset: zone1_rpool/rpool/ROOT/solaris Cache: Using /var/pkg/publisher. Updating non-global zone: Linking to image /. Processing linked: 1/1 done Updating non-global zone: Auditing packages. No updates necessary for this image. Updating non-global zone: Zone updated. Result: Attach Succeeded. Log saved in non-global zone as /zones/zone1/root/var/log/zones/zoneadm.20121022T190441Z.zone1.attach root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - zone1_rpool 1.98G 491M 1.51G 24% 1.00x ONLINE - root@sysA:~# zoneadm -z zone1 boot root@sysA:~# zlogin zone1 [Connected to zone 'zone1' pts/2] Oracle Corporation SunOS 5.11 11.1 September 2012 root@zone1:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 1.98G 538M 1.46G 26% 1.00x ONLINE - Step 15) Check for the file created on sysB, earlier. root@zone1:~# ls -l /opt total 1 -rw-r--r-- 1 root root 0 Oct 22 14:47 fileA Next Steps Here is a brief list of some of the fun things you can try next. Add space to the zone by adding a second storage device to the rootzpool. Make sure that you add it to the configurations of both zones! Create a new zone, specifying two disks in the rootzpool when you first configure the zone. When you install that zone, or clone it from another zone, zoneadm uses those two disks to create a mirrored pool. (Three disks will result in a three-way mirror, etc.) Conclusion Hopefully you have seen the ease with which you can now move Solaris Zones from one system to another.

    Read the article

  • Part 2 – Load Testing In The Cloud

    - by Tarun Arora
Welcome to Part 2. In Part 1 we discussed the advantages of creating a Test Rig in the cloud, the Azure edge and the Test Rig topology we want to get to. In Part 2, let's start by understanding the components of Azure we'll be making use of, followed by manually putting them together to create the test rig. So… let's get down and dirty and start setting up the Test Rig.

What Components of Azure will I be using for building the Test Rig in the Cloud?
To run the Test Agents we'll make use of Windows Azure Compute, and to enable communication between the Test Controller and Test Agents we'll make use of Windows Azure Connect.

Azure Connect
The Test Controller is on premise and the Test Agents are in the cloud (how will they talk?). To enable communication between the two, we'll make use of Windows Azure Connect. With Windows Azure Connect, you can use a simple user interface to configure IPsec-protected connections between computers or virtual machines (VMs) in your organization's network and roles running in Windows Azure. With this you can now join Windows Azure role instances to your domain, so that you can use your existing methods for domain authentication, name resolution, or other domain-wide maintenance actions. For more details refer to an overview of Windows Azure Connect. A very useful video explains everything you wanted to know about Windows Azure Connect.

Azure Compute
Windows Azure Compute provides developers a platform to host and manage applications in Microsoft's data centres across the globe. A Windows Azure application is built from one or more components called 'roles.' Roles come in three different types: Web role, Worker role, and Virtual Machine (VM) role; we'll be using the Worker role to set up the Test Agents. A very nice blog post discusses the difference between the 3 role types. Developers are free to use the .NET framework or other software that runs on Windows with the Worker role or Web role. Developers can also create applications using languages such as PHP and Java. More on Windows Azure Compute. Each Windows Azure compute instance represents a virtual server:

Virtual Machine Size   CPU Cores   Memory     Cost Per Hour
Extra Small            Shared      768 MB     $0.04
Small                  1           1.75 GB    $0.12
Medium                 2           3.50 GB    $0.24
Large                  4           7.00 GB    $0.48
Extra Large            8           14.00 GB   $0.96

You might want to review the Windows Azure Pricing FAQ.

Let's Get Started building the Test Rig…

Machine   Role                                Comments
VM – 1    Domain Controller for Playpit.com   On Premise
VM – 2    TFS, Test Controller                On Premise
VM – 3    Test Agent                          Cloud

In this blog post I assume that you have the domain, Team Foundation Server and Test Controller installed and set up already. If not, please refer to the TFS 2010 Installation Guide and this walkthrough on MSDN to set up your Test Controller. You can also download a preconfigured TFS 2010 VM from Brian Keller's blog; Brian also has some great hands-on labs on TFS 2010 that you may want to explore.

I. Let's start building VM – 3: The Test Agent
Download the Windows Azure SDK and Tools. Open Visual Studio and create a new Windows Azure Project using the Cloud Template, and choose the Worker Role for the reasons explained in the earlier post. The WorkerRole.cs implements the Run() and OnStart() methods; no code changes are required. You should be able to compile the project and run it in the compute emulator (the compute emulator should have been installed as part of the Windows Azure Toolkit) on your local machine. A sketch of the scaffolded worker role is shown below.
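
For reference, here is roughly what the scaffolded WorkerRole.cs looks like. This is a sketch from memory of the Visual Studio 2010 Azure template, so treat the details as an approximation rather than the template verbatim; the role itself just loops and traces, and the Test Agent software installed later does the real work.

using System.Diagnostics;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        // The role has nothing to do itself; it just stays alive and
        // traces a heartbeat so you can see it running in the emulator.
        Trace.WriteLine("WorkerRole1 entry point called", "Information");
        while (true)
        {
            Thread.Sleep(10000);
            Trace.WriteLine("Working", "Information");
        }
    }

    public override bool OnStart()
    {
        // Set the maximum number of concurrent outbound connections.
        ServicePointManager.DefaultConnectionLimit = 12;
        return base.OnStart();
    }
}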
We will only be making changes to WindowsAzureProject, open ServiceDefinition.csdef. Ensure that the vmsize is small (remember the cost chart above). Import the “Connect” module. I am importing the Connect module because I need to join the Worker role VM to the Playpit domain. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect"/> </Imports> </WorkerRole> </ServiceDefinition> Go to the ServiceConfiguration.Cloud.cscfg and note that settings with key ‘Microsoft.WindowsAzure.Plugins.Connect.%%%%’ have been added to the configuration file. This is because you decided to import the connect module. See the config below. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration>             Let’s go step by step and understand all the highlighted parameters and where you can find the values for them.       osFamily – By default this is set to 1 (Windows Server 2008 SP2). Change this to 2 if you want the Windows Server 2008 R2 operating system. The Advantage of using osFamily = “2” is that you get Powershell 2.0 rather than Powershell 1.0. In Powershell 2.0 you could simply use “powershell -ExecutionPolicy Unrestricted ./myscript.ps1” and it will work while in Powershell 1.0 you will have to change the registry key by including the following in your command file “reg add HKLM\Software\Microsoft\PowerShell\1\ShellIds\Microsoft.PowerShell /v ExecutionPolicy /d Unrestricted /f” before you can execute any power shell. The other reason you might want to move to os2 is if you wanted IIS 7.5.       Activation Token – To enable communication between the on premise machine and the Windows Azure Worker role VM both need to have the same token. Log on to Windows Azure Management Portal, click on Connect, click on Get Activation Token, this should give you the activation token, copy the activation token to the clipboard and paste it in the configuration file. 
Note – later in the blog I'll show you how to install Connect on the on-premise machine.

EnableDomainJoin – Set the value to true; of course we want to join the Windows Azure worker role VM to the domain.

DomainFQDN, DomainControllerFQDN, DomainAccountName, DomainPassword, DomainOU, Administrators – This information is specific to your domain. I have extracted this information from the 'service manager' and 'Active Directory Users and Computers'. Also, I created a new domain OU, namely 'CloudInstances', so that all my cloud instances joined to my domain show up there; this is optional. You can encrypt the DomainPassword – refer to the instructions here. Or hold fire, I'll be covering that when I come to certificates and encryption in the coming section.

Now once you have filled in all this information, the configuration file should look something like below,

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*">
  <Role name="WorkerRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

Next we will be enabling the Remote Desktop module in the ServiceDefinition.csdef. We could make the changes manually or allow a beautiful wizard to help us make them; I prefer the second option. So right click on the Windows Azure project and choose Publish. Now once you get the publish wizard, if you haven't already, you will be asked to import your Windows Azure subscription; this is simply the MSDN subscription activation key xml. Once you have done that, click Next to go to the Settings page and check 'Enable Remote Desktop for all roles'. As soon as you do that you get another pop up asking you the details for the user that you would be logging in with (make sure you enter a reasonable expiry date; you do not want the user account to expire today). Notice the "more information" tag at the bottom; click that to get access to the certificate section. See the screen shot below.
From the drop down select the option to create a new certificate        In the pop up window enter the friendly name for your certificate. In my case I entered ‘WAC – Test Rig’ and click ok. This will create a new certificate for you. Click on the view button to see the certificate details. Do you see the Thumbprint, this is the value that will go in the config file (very important). Now click on the Copy to File button to copy the certificate, we will need to import the certificate to the windows Azure Management portal later. So, make sure you save it a safe location.                                Click Finish and enter details of the user you would like to create with permissions for remote desktop access, once you have entered the details on the ‘Remote desktop configuration’ screen click on Ok. From the Publish Windows Azure Wizard screen press Cancel. Cancel because we don’t want to publish the role just yet and Yes because we want to save all the changes in the config file.       Now if you go to the ServiceDefinition.csdef file you will see that the RemoteAccess and RemoteForwarder roles have been imported for you. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect" /> <Import moduleName="RemoteAccess" /> <Import moduleName="RemoteForwarder" /> </Imports> </WorkerRole> </ServiceDefinition> Now go to the ServiceConfiguration.Cloud.cscfg file and you see a whole bunch for setting “Microsoft.WindowsAzure.Plugins.RemoteAccess.%%%” values added for you. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" 
value="MIIBnQYJKoZIhvcNAQcDoIIBjjCCAYoCAQAxggFOMIIBSgIBADAyMB4xHDAaBgNVBAMME1dpbmRvd 3MgQXp1cmUgVG9vbHMCEGa+B46voeO5T305N7TSG9QwDQYJKoZIhvcNAQEBBQAEggEABg4ol5Xol66Ip6QKLbAPWdmD4ae ADZ7aKj6fg4D+ATr0DXBllZHG5Umwf+84Sj2nsPeCyrg3ZDQuxrfhSbdnJwuChKV6ukXdGjX0hlowJu/4dfH4jTJC7sBWS AKaEFU7CxvqYEAL1Hf9VPL5fW6HZVmq1z+qmm4ecGKSTOJ20Fptb463wcXgR8CWGa+1w9xqJ7UmmfGeGeCHQ4QGW0IDSBU6ccg vzF2ug8/FY60K1vrWaCYOhKkxD3YBs8U9X/kOB0yQm2Git0d5tFlIPCBT2AC57bgsAYncXfHvPesI0qs7VZyghk8LVa9g5IqaM Cp6cQ7rmY/dLsKBMkDcdBHuCTAzBgkqhkiG9w0BBwEwFAYIKoZIhvcNAwcECDRVifSXbA43gBApNrp40L1VTVZ1iGag+3O1" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2012-11-27T23:59:59.0000000+00:00" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" /> </ConfigurationSettings> <Certificates> <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="AA23016CF0BDFC344400B5B82706B608B92E4217" thumbprintAlgorithm="sha1" /> </Certificates> </Role> </ServiceConfiguration>          Okay let’s look at them one at a time,       Enabled - Yes, we would like to enable Remote Access.       AccountUserName – This is the user name you entered while you were on the publish windows azure role screen, as detailed above.       AccountEncrytedPassword – Try and decode that, the certificate is used to encrypt the password you specified for the user account. Remember earlier i said, either use the instructions or wait and i’ll be showing you encryption, now the user account i am using for rdp has the same password as my domain password, so i can simply copy the value of the AccountEncryptedPassword to the DomainPassword as well.       AccountExpiration – This is the expiration as you specified in the wizard earlier, make sure your account does not expire today.       Remote Forwarder – Check out the documentation, below is how I understand it, -- One role in an application that implements a remote desktop connection must import the RemoteForwarder module. The two modules work together to enable the remote desktop connections to role instances. -- If you have multiple roles defined in the service model, it does not matter which role you add the RemoteForwarder module to, but you must add it to only one of the role definitions.       Certificate – Remember the certificate thumbprint from the wizard, the on premise machine and windows azure role machine that need to speak to each other must have the same thumbprint. More on that when we install Windows Azure connect Endpoints on the on premise machine. As i said earlier, in this blog post, I’ll be showing you the manual process so i won’t be scripting any star up tasks to install the test agent or register the test agent with the TFS Server. I’ll be showing you all this cool stuff in the next blog post, that’s because it’s important to understand the manual side of it, it becomes easier for you to troubleshoot in case something fails. Having said that, the changes we have made are sufficient to spin up the Windows Azure Worker Role aka Test Agent VM, have it connected with the play.pit.com domain and have remote access enabled on it. Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server. II. Windows Azure Connect: Setting up Connect on VM – 2 i.e. TFS & Test Controller Glad you made it so far, now to enable communication between the on premise TFS/Test Controller and Azure-ed Test Agent we need to enable communication. 
We have set up the Azure Connect module in the Test Agent configuration; now the Connect endpoints need to be enabled on the on-premise machines. Let's have a look at how we can do this.
- Log on to VM – 2, running the TFS Server and Test Controller.
- Log on to the Windows Azure Management Portal and click on Virtual Network. If you already have a subscription you should see the below screen shot; if not, you would be asked to complete the subscription first.
- Click on Install Local Endpoints from the top left of the panel and you get a url with a token id appended to it. Remember the token I showed you earlier: in theory the token you get here should match the token you added to the Test Agent config file.
- Copy the url to the clipboard and paste it in IE (important: the installation at present only works out of IE, and you need to have cookies enabled in order to complete the installation). As stated in the pop up, you can NOT download and run the software later; you need to run it as is, since it contains a token. Once the installation completes you should see the Windows Azure Connect icon in the system tray.
- Right click the Azure Connect icon, choose Diagnostics, and refer to this link for diagnostic detail terminology.
- NOTE – Unfortunately I could not see the Windows Azure Connect icon in the system tray; a bit of binging with Google revealed that the Azure Connect icon is only shown when the 'Windows Azure Connect Endpoint' service is started. So go to services.msc and make sure that the service is started; if not, start it. Unfortunately again, the service did not start for me on a manual start, and I realised that one of the dependent services was disabled. You can look at the service dependencies, start them, and then start Windows Azure Connect. Bottom line: you need to start the Windows Azure Connect service before you can proceed. Please refer here on MSDN for more on troubleshooting Windows Azure Connect. (Follow the next step as well.)
- Now go back to the Windows Azure Management Portal and from Groups and Roles create a new group; let's call it 'Test Rig'. Make sure you add VM – 2 (the TFS Server VM where you just installed the endpoint).
- Now if you go back to the Azure Connect icon in the system tray and click 'Refresh Policy', you will notice that the disconnected status of the icon changes to ready for connection.

III. Importing the Certificate in to the Windows Azure Management Portal
But before that, you need to import the certificate you created in Step I in to the Windows Azure Management Portal.
- Log on to the Windows Azure Management Portal and click on 'Hosted Services, Storage Accounts & CDN', then 'Management Certificates', followed by Add Certificates, as shown in the screen shot below.
- Browse to the location where you saved the certificate earlier. Remember… refer to Step I in case you forgot.
- Now you should be able to see the imported certificate here; make sure the thumbprint of the certificate matches the one you inserted in the config files.

IV. Publish Windows Azure Worker Role aka Test Agent
Having completed I, II and III, you are ready to publish the Test Agent VM – 3 to the cloud. Go to Visual Studio, right click the Windows Azure project and select Publish. Verify the information in the wizard; from the advanced settings tab you can also enable capture of IntelliTrace or profiling information. Click Next and click Publish!
From the View menu select the Windows Azure Activity Log window. Now you should be able to see the deployment progress in real time. In the Windows Azure Management Portal, you should also be able to see the progress of the creation of a new Worker Role.

Once the deployment is complete you should be able to RDP (go to the run prompt, type mstsc, and in the pop up enter the machine name) in to the Test Agent Worker Role VM from the Playpit network using the domain admin user account. In case you are unable to log in to the Test Agent using the domain admin user account, it means the process of joining the Test Agent to the domain has failed! But the good news is, because you imported the Connect module, you can connect to the Test Agent machine using the Windows Azure Management Portal and troubleshoot the reason for failure: you will be able to log in with the user name and password you specified in the config file for the keys 'RemoteAccess.AccountUsername' and 'RemoteAccess.AccountEncryptedPassword' (just enter the password unencrypted), fix it, or manually join the machine to the domain. Once you have managed to join the Test Agent VM to the domain, move to the next step.

So, log in to the Test Agent Worker Role VM with the Playpit domain administrator and verify that you can log in, that the machine is connected to the domain, and that the Connect service is successfully running. If yes, give yourself a pat on the back: you are 80% mission accomplished!

Go to the Windows Azure Management Portal and click on Virtual Network, click on Groups and Roles, click on Test Rig, then click Edit Group to edit the Test Rig group you created earlier. In the "Connect to" section, click on Add to select the worker role you have just deployed. Also, check 'Allow connections between endpoints in the group'; with this you enable communication between the test controller and test agents, and between test agents. Click Save.

Now you are ready to deploy the Test Agent software on the Worker Role Test Agent VM and configure it to work with the Test Controller.

V. Configuring VM – 3: Installing Test Agent and Associating Test Agent to Controller
- Log in to the Worker Role Test Agent VM that you have just successfully deployed; make sure you log in with the domain administrator account.
- Download the All Agents software from MSDN, 'en_visual_studio_agents_2010_x86_x64_dvd_509679.iso', extract the iso and navigate to where you have extracted it. In my case, I have extracted the iso to "C:\Resources\Temp\VsAgentSetup".
- Open the Test Agent folder and double click on setup.exe. Once you have installed the Test Agent you should reach the configuration window. If you face any issues installing the TFS Test Agent on the VM, refer to the walkthrough on MSDN.
- Once you have successfully installed the Test Agent software you will need to configure the test agent. Right click the test agent configuration tool and run it as a different user, i.e. an Administrator. This is really to run the configuration wizard with elevated privileges (you might have UAC block some things otherwise).
- In the run options, you can select 'service'; you do not need to run the agent as interactive unless you are running coded UI tests. I have specified the domain administrator to connect to the TFS Test Controller. In real life, I would never do that; I would create a separate test user service account for this purpose.
For this blog post, though, we are using the most powerful user so that any policies or restrictions don't block you.

Click the Apply Settings button and you should be all green! If not, the summary usually gives helpful error messages that you can resolve before proceeding. In my experience, you may run into either a permission issue or a firewall blocking communication.

And now the moment of truth! Go to VM – 2, open up Visual Studio and from the Test menu select Manage Test Controller. Mission accomplished! You should be able to see the Test Agent that you have just configured here.

VI. Creating and Running Load Tests on your brand new Azure-ed Test Rig

I have various blog posts on Performance Testing with Visual Studio Ultimate; you can follow the links and videos below.

Blog Posts:
- Part 1 – Performance Testing using Visual Studio 2010 Ultimate
- Part 2 – Performance Testing using Visual Studio 2010 Ultimate
- Part 3 – Performance Testing using Visual Studio 2010 Ultimate

Videos:
- Test Tools Configuration & Settings in Visual Studio
- Why & How to Record Web Performance Tests in Visual Studio Ultimate
- Goal Driven Load Testing using Visual Studio Ultimate

Now that you have created your load tests, there is one last change you need to make before you can run the tests on your Azure Test Rig: create a new test settings file, change the Test Execution method to 'Remote Execution' and select the test controller you have configured the Worker Role Test Agent against, in our case VM – 2.

So, go on, fire off a test run and see the results of the test being executed on the Azure-ed Test Rig.

Review and What's next?

A quick recap of the benefits of running the Test Rig in the cloud, and what I will be covering in the next blog post. I would love to hear your feedback!

Advantages
- Utilizing the power of Azure compute to run a heavy virtual user load.
- Benefiting from Azure's flexibility: destroy Test Agents when not in use; it takes less than 25 minutes to spin up a new Test Agent.
- Most important, test network latency. (Network latency and the speed of the connection are two different things; network latency is usually very hard to test.) By placing Test Agents in Microsoft data centres around the globe, one can actually test the lag in transferring the bytes, not because of a slow connection but because the page has been requested from the other side of the globe.

Next Steps
The process of spinning up the Test Agents in Windows Azure is not 100% automated. I am working on the worker process and PowerShell scripts to automate the role deployment, unattended install of the test agent software, and registration of the test agent to the test controller. In the next blog post I will show you how to make the complete process unattended and automated.

Remember to subscribe to http://feeds.feedburner.com/TarunArora. Hope you enjoyed this post; I would love to hear your feedback! If you have any recommendations on things that I should consider, or any questions or feedback, feel free to leave a comment. See you in Part III.

    Read the article

  • Project Management Helps AmeriCares Deliver International Aid

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT - ORACLE - by Alison Weiss

    Handle with Care

    Sound project management helps AmeriCares bring international aid to those in need. The stakes are always high for AmeriCares. On a mission to restore health and save lives during times of disaster, the nonprofit international relief and humanitarian aid organization delivers donated medicines, medical supplies, and humanitarian aid to people in the U.S. and around the globe. Founded in 1982 with the express mission of responding as quickly and efficiently as possible to help people in need, the Stamford, Connecticut-based AmeriCares has delivered more than US$10.5 billion in aid to 147 countries over the past three decades. “It’s critically important to us that we steward all the donations and that the medical supplies and medicines get to people as quickly as possible with no loss,” says Kate Sears, senior vice president for finance and technology at AmeriCares. “Whether we’re shipping IV solutions to victims of cholera in Haiti or antibiotics to Somali famine victims, we need to get the medicines there sooner because it means more people will be helped and lives improved or even saved.” Ten years ago, the tracking systems used by AmeriCares associates were paper-based. In recent years, staff started using spreadsheets, but the tracking processes were not standardized between teams. “Every team was tracking completely different information,” says Megan McDermott, senior associate, Sub-Saharan Africa partnerships, at AmeriCares. “It was just a few key things. For example, we tracked the date a shipment was supposed to arrive and the date we got reports from our partner that a hospital received aid on their end.” While the data was accurate, much detail was being lost in the process. AmeriCares management knew it could do a better job of tracking this enterprise data and in 2011 took a significant step by implementing Oracle’s Primavera P6 Professional Project Management. “It’s a comprehensive solution that has helped us improve the monitoring and controlling processes. It has allowed us to do our distribution better,” says Sears. In addition, the implementation effort has been a change agent, helping AmeriCares leadership rethink project management across the entire organization. Initially, much of the focus was on standardizing processes, but staff members also learned the importance of thinking proactively to prevent possible problems and evaluating results to determine if goals and objectives are truly being met. Such data about process efficiency and overall results is critical not only to AmeriCares staff but also to the donors supporting the organization’s life-saving missions.

    Efficiency Saves Lives

    One of AmeriCares’ core operations is to gather product donations from the private sector, establish where the most-urgent needs are, and solicit monetary support to send the aid via ocean cargo or airlift to welfare- and health-oriented nongovernmental organizations, hospitals, health networks, and government ministries based in areas in need. In 2011 alone, AmeriCares sent more than 3,500 shipments to 95 countries in response to both ongoing humanitarian needs and more than two dozen emergencies, including deadly tornadoes and storms in the U.S. and the devastating tsunami in Japan. When it comes to nonprofits in general, donors want to know that the charitable organizations they support are using funds wisely.
Typically, nonprofits are evaluated by donors in terms of efficiency, an area where AmeriCares has an excellent reputation: 98 percent of expenses go directly to supporting programs and less than 2 percent represent administrative and fundraising costs. Donors, however, should look at more than simple efficiency, says Peter York, senior partner and chief research and learning officer at TCC Group, a nonprofit consultancy headquartered in New York, New York. They should also look at whether organizations have the systems in place to sustain their missions and continue to thrive. An expert on nonprofit organizational management, York has spent years studying sustainable charitable organizations. He defines them as nonprofits that are able to achieve the ongoing financial support to stay relevant and continue doing core mission work. In his analysis of well over 2,500 larger nonprofits, York has found that many are not sustaining, and are actually scaling back in size. “One of the biggest challenges of nonprofit sustainability is the general public’s perception that every dollar donated has to go only to the delivery of service,” says York. “What our data shows is that there are some fundamental capacities that have to be there in order for organizations to sustain and grow.” York’s research highlights the importance of data-driven leadership at successful nonprofits. “You’ve got to have the tools, the systems, and the technologies to get objective information on what you do, the people you serve, and the results you’re achieving,” says York. “If leaders don’t have the knowledge and the data, they can’t make the strategic decisions about programs to take organizations to the next level.” Historically, AmeriCares associates have used time-tested and cost-effective strategies to ship and then track supplies from donation to delivery to their destinations in designated time frames. When disaster strikes, AmeriCares ships by air and generally pulls out all the stops to deliver the most urgently needed aid within the first few days and weeks. Then, as situations stabilize, AmeriCares turns to delivering sea containers for the postemergency and ongoing aid so often needed over the long term. According to McDermott, getting a shipment out the door is fairly complicated, requiring as many as five different AmeriCares teams collaborating together. The entire process can take months—from when products are received in the warehouse and deciding which recipients to allocate supplies to, to getting customs and governmental approvals in place, actually shipping products, and finally ensuring that the products are received in-country. Delivering that aid is no small affair. “Our volume exceeds half a billion dollars a year worth of donated medicines and medical supplies, so it’s a sizable logistical operation to bring these products in and get them out to the right place quickly to have the most impact,” says Sears. “We really pride ourselves on our controls and efficiencies.” Adding to that complexity is the fact that the longer it takes to deliver aid, the more dire the human need can be. Any time AmeriCares associates can shave off the complicated aid delivery process can translate into lives saved. “It’s really being able to track information consistently that will help us to see where are the bottlenecks and where can we work on improving our processes,” says McDermott. 
    Setting a Standard

    Productivity and information management improvements were key objectives for AmeriCares when staff began the process of implementing Oracle’s Primavera solution. But before configuring the software, the staff needed to take the time to analyze the systems already in place. According to Greg Loop, manager of database systems at AmeriCares, the organization received guidance from several consultants, including Rich D’Addario, consulting project manager in the Primavera Global Business Unit at Oracle, who was instrumental in shepherding the critical requirements-gathering phase. D’Addario encouraged staff to begin documenting shipping processes by considering the order in which activities occur and which ones are dependent on others to get accomplished. This exercise helped everyone realize that to be more efficient, they needed to keep track of shipments in a more standard way. “The staff didn’t recognize formal project management methodology,” says D’Addario. “But they did understand what the most important things are and that if they go wrong, an entire project can go off course.” Before, if a boatload of supplies was being sent to Haiti and there was a problem somewhere, a lot of time was taken up finding out where the problem was—because staff was not tracking things in a standard way. As a result, even more time was needed to find possible solutions to the problem and alert recipients that the aid might be delayed. “For everyone to put on the project manager hat and standardize the way every single thing is done means that now the whole organization is on the same page as to what needs to occur from the time a hurricane hits Haiti and when a boat pulls in to unload supplies,” says D’Addario. With so much care taken to put a process foundation firmly in place, configuring the Primavera solution was actually quite simple. Specific templates were set up for different types of shipments, and dashboards were implemented to provide executives with clear overviews of every project in the system. AmeriCares’ Loop reports that system planning, refining, and testing, followed by writing up documentation and training, took approximately four months. The system went live in spring 2011 at AmeriCares’ Connecticut headquarters. While the nonprofit has an international presence, with warehouses in Europe and offices in Haiti, India, Japan, and Sri Lanka, most donated medicines come from U.S. entities and are shipped from the U.S. out to the rest of the world. In addition, all shipments are tracked from the U.S. office. AmeriCares doesn’t expect the Primavera system to take months off the shipping time, especially for sea containers. However, any time saved is still important because it will allow aid to be delivered to people more quickly at a lower overall cost. “If we can trim a day or two here or there, that can translate into lives that we’re saving, especially in emergency situations,” says Sears.

    A Cultural Change

    Beyond the measurable benefits that come with IT-driven process improvement, AmeriCares management is seeing a change in culture as a result of the Primavera project. One change has been treating every shipment of aid as a project, and everyone involved with facilitating shipments as a project manager. “This is a revolutionary concept for us,” says McDermott.
    “Before, we were used to thinking we were doing logistics—getting a container from point A to point B without looking at it as one project and really understanding what it meant to manage it.” AmeriCares staff is also happy to report that collaboration within the organization is much more efficient. When someone creates a shipment in the Primavera system, the same shared template is used, which means anyone can log in to the system to see the status of a shipment. Knowledgeable staff can access a shipment project to help troubleshoot a problem. Management can easily check the status of projects across the organization. “Dashboards are really useful,” says McDermott. “Instead of going into the details of each project, you can just see the high-level real-time information at a glance.” The new system is helping team members focus on proactively managing shipments rather than simply reacting when problems occur. For example, when a container is shipped, documents must be included for customs clearance. Now, the shipping template has built-in reminders to prompt team members to ask for copies of these documents from freight forwarders and to follow up with partners to discover if a shipment is on time. In the past, staff may not have worked on securing these documents until they’d been notified a shipment had arrived in-country. Another benefit of capturing and adopting best practices within the Primavera system is that staff training is easier. “Capturing the processes in documented steps and milestones allows us to teach new staff members how to do their jobs faster,” says Sears. “It provides them with the knowledge of their predecessors so they don’t have to keep reinventing the wheel.” With the Primavera system already generating positive results, management is eager to take advantage of advanced capabilities. Loop is working on integrating the company’s proprietary inventory management system with the Primavera system so that when logistics or warehousing operators input data, the information will automatically go into the Primavera system. In the past, this information had to be manually keyed into spreadsheets, often leading to errors.

    Mining Historical Data

    Another feature on the horizon for AmeriCares is utilizing Primavera P6 Professional Project Management reporting capabilities. As the system begins to include more historical data, management soon will be able to draw on this information to conduct analysis that has not been possible before and create customized reports. For example, at the beginning of the shipment process, staff will be able to use historical data to more accurately estimate how long the approval process should take for a particular country. This could help ensure that food and medicine with limited shelf lives do not get stuck in customs or used beyond their expiration dates. The historical data in the Primavera system will also help AmeriCares with better planning year to year. The nonprofit’s staff has always put together a plan at the beginning of the year, but this has been very challenging simply because it is impossible to predict disasters. Now, management will be able to look at historical data and see trends and statistics as they set current objectives and prepare for future need. In addition, this historical data will provide AmeriCares management with the ability to review year-end data and compare actual project results with goals set at the beginning of the year—to see if desired outcomes were achieved and if there are areas that need improvement.
It’s this type of information that is so valuable to donors. And, according to York, project management software can play a critical role in generating the data to help nonprofits sustain and grow. “It is important to invest in systems to help replicate, expand, and deliver services,” says York. “Project management software can help because it encourages nonprofits to examine program or service changes and how to manage moving forward.” Sears believes that AmeriCares donors will support the return on investment the organization will achieve with the Primavera solution. “It won’t be financial returns, but rather how many more people we can help for a given dollar or how much more quickly we can respond to a need,” says Sears. “I think donors are receptive to such arguments.” And for AmeriCares, it is all about the future and increasing results. The project management environment currently may be quite simple, but IT staff plans to expand the complexity and functionality as the organization grows in its knowledge of project management and the goals it wants to achieve. “As we use the system over time, we’ll continue to refine our best practices and accumulate more data,” says Sears. “It will advance our ability to make better data-driven decisions.”

    Read the article

  • CodePlex Daily Summary for Saturday, August 18, 2012

    CodePlex Daily Summary for Saturday, August 18, 2012

    Popular Releases
    - ExtAspNet: ExtAspNet v3.1.9: +2012-08-18 v3.1.9
    - XDA ROM HUB: Release v0.9.3: Changelog: Added a new UI; Removed unnecessary features; Fixed a lot of bugs; Adding new firmwares (Please help me); Added a new app for Android; Ability to backup apps to the SD Card; Ability to create Nandroid backups without a PC; Ability to backup apps and restore from CWM; Improved download manager, now supporting pause and resume; Converting Bytes to MegaBytes; Improved dialogs; Removed support for 7z files; Changed installer. Known bugs: Dialog message will always appear when clic...
    - Task Card Creator 2010: TaskCardCreator2010 4.0.2.0: What's new: UI/UX improved using a contextual ribbon tab for reports; Finishing the "new 4.0 UI"; Report template help improved; New project branch to support TFS 2012: http://taskcardcreator2012.codeplex.com; User interface made more modern (4.0.1.0); Smarter algorithm used for report generation (4.0.1.0); Quality setting added (4.0.1.0); Terms harmonized (4.0.1.0); Miscellaneous optimizations (4.0.1.0); Fixed critical issue introduced in 4.0.0.0 (4.0.1.0)
    - AcDown Downloader Framework: AcDown v4.0.1: supports Acfun, Bilibili, YouTube, SF and other sites; works with the AcPlay player; written in C# and requires .NET Framework 2.0; runs on 32-bit and 64-bit Windows XP/Vista/7/8 (on Windows XP, install .NET Framework 2.0 (x86) first)
    - Fluent Validation for .NET: 3.4: Changes since 3.3: Make ValidationResult.IsValid virtual; Add private no-arg ctor to ValidationFailure to help with serialization; Add Turkish error messages; Work-around for reflection bug in .NET 4.5 that caused VerificationExceptions; Assemblies are now unsigned to ease versioning/upgrades, especially where other frameworks depend on FV (note: if you need signed assemblies you can use the following NuGet packages: FluentValidation-signed, FluentValidation.MVC3-signed, FluentV...)
    - fastJSON: v2.0.2: bug fix $types and arrays
    - AssaultCube Reloaded: 2.5.3 Unnamed Fixed: If you are using deltas, download 2.5.2 first, then overwrite with the delta packages. Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your ...
    - Coding4Fun Tools: Coding4Fun.Phone.Toolkit v1.6.1: Bug fix release. Bug fixes: Better support for transparent images; IsFrozen respected if not bound; corrected deadlock state
    - WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.7: Version 2.5.0.7 (Milestone 7). This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements: .NET Framework 4.0 (the package contains a solution file for Visual Studio 2010); the unit test projects require Visual Studio 2010 Professional. Changelog (legend: [B] breaking change; [O] marked member as obsolete): WAF: Add CollectionHelper.GetNextElementOrDefault method. InfoMan: Support creating a new email and saving it in the Send b...
    - Diablo III Drop Statistics Service: D3DSS 1.0.1
    - myCollections: Version 2.2.3.0: New in this version: Added setup package. Added Amazon Spain for Apps, Books, Games, Movie, Music, Nds and Tvshow. Added TVDB Spain for Tvshow. Added TMDB Spain for Movies. Added Auto rename files from title. Added more filters when adding files (vob, mpls, ifo...). Improve Books author and Music Artist Credits. Rewrite find duplicates for better performance. You can now add Custom link to items. You can now add type directly from the type list using right mouse button. Bug ...
    - Player Framework by Microsoft: Player Framework for Windows 8 Preview 5 (Refresh): Support for Windows 8 and Visual Studio RTM; Support for Smooth Streaming SDK beta 2; Support for live playback; New bitrate meter and SD/HD indicators; Auto smooth streaming track restriction for snapped mode to conserve bandwidth; New "Go Live" button and SeekToLive API; Support for offset start times; Support for live position unique from end time; Support for multiple audio streams (smooth and progressive content); Improved intellisense in JS version. New to Preview 5 Refresh: Req...
    - TFS Workbench: TFS Workbench v2.2.0.10: Compiled installers for TFS Workbench 2.2.0.10. Bug fix: fixed bug that stopped the change workspace action from working.
    - Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.60: Allow for CSS3 grid-column and grid-row repeat syntax. Provide option for -analyze scope-report output to be in XML for easier programmatic processing; also allow for the report to be saved to a separate output file.
    - ClosedXML - The easy way to OpenXML: ClosedXML 0.67.2: v0.67.2: fix when copying conditional formats with relative formulas. v0.67.1: misc fixes to the conditional formats. v0.67.0: conditional formats now accept formulas; major performance improvement when opening files with merged ranges; misc fixes.
    - Umbraco CMS: Umbraco 4.8.1: What's new. Bug fixes: Fixed: when upgrading to 4.8.0, the database upgrade didn't run. Update: unfortunately, upgrading with SQLCE is problematic; there's a workaround here: http://bit.ly/TEmMJN. The changes to the <imaging> section in umbracoSettings.config caused errors when you didn't apply them during the upgrade; defaults will now be used if any keys are missing. Scheduled unpublishes now only unpublish nodes set to published rather than newest. Work item 30937: fixed problem with Fi...
    - patterns & practices - Unity: Unity 3.0 for .NET 4.5 and WinRT - Preview: The Unity 3.0.1208.0 Preview enables Unity to work on .NET 4.5 with both the WinRT and desktop profiles. This is an updated version of the port after the .NET Framework 4.5 and Windows 8 have RTM'ed. Please see the Release Notes. Providing feedback: post your feedback on the Unity forum; submit and vote on new features for Unity on our Uservoice site.
    - Self-Tracking Entity Generator for WPF and Silverlight: Self-Tracking Entity Generator v 2.0.0 for VS11: Self-Tracking Entity Generator for WPF and Silverlight v 2.0.0 for Entity Framework 5.0 and Visual Studio 2012
    - NPOI: NPOI 2.0: New features: (a) implement OpenXml4Net (same as System.Packaging from Microsoft), supporting both .NET 2.0 and .NET 4.0; (b) Excel 2007 read/write library (NPOI.XSSF); (c) Word 2007 read/write library (NPOI.XWPF); (d) the NPOI.SS namespace becomes the interface shared between XSSF and HSSF; (e) load an xlsx template and save as a new xlsx file (partially supported); (f) diagonal line in cell, both in xls and xlsx; (g) support isRightToLeft and setRightToLeft on the common spreadsheet Sheet interface, as per existin...
    - BugNET Issue Tracker: BugNET 1.1: This release includes bug fixes from the 1.0 release for email notifications, RSS feeds, and several other issues. Please see the change log for a full list of changes: http://support.bugnetproject.com/Projects/ReleaseNotes.aspx?pid=1&m=76. Upgrade notes: the following changes to the web.config in the profile section have occurred: removed <add name="NotificationTypes" type="String" defaultValue="Email" customProviderData="NotificationTypes;nvarchar;255" />; added <add name="ReceiveEmailNotifi...

    New Projects
    - cocos2d simpledemo: simple demo cocos2d
    - DiscuzViet: Discuz Viet Nam - discuz.vn
    - EaseSettings: EaseSettings is a library designed to allow easy usage of settings with a highly structured treenode form.
    - Face Detection & Recognition: Face Detection & Recognition project. It provides an interface for face detection and recognition.
    - FIM 2010 GoogleApps MA: Forefront Identity Manager 2010 Management Agent for Google Apps. You can synchronize users between Google Apps and Forefront Identity Manager 2010.
    - Get System Info: Get System Information is an open source project licensed under the GNU Public License v2. This project allows you to view information about your system.
    - go launcher: Go launcher
    - grOOvy language (the spec): A formal specification of what the Groovy Language does.
    - JavaScript Prototype Extension: A standalone JavaScript extension based directly on JavaScript prototypes, to let developers use JS like C# or Java.
    - Jump N Roll Windows Phone 7: Jump n Roll is a remake of the classic Amiga game for Windows Phone 7, using VB.NET and XNA. Version 1 is in a working but unfinished state.
    - KingOfHorses: This is a Capstone Project from FPT University.
    - Lan House Manager: Final-year project for the Computer Science course, 2009, UNISANTA. .NET, C#, WPF, ASP.NET MVC, Entity Framework, SQL Server.
    - My source code: This project contains my source code.
    - MySimpleProjectForKudu: Cool stuff
    - NWP INTRANET: NWP Intranet, a complete and useful intranet application for a wide range of companies.
    - P.O.W. - PowerShell on the Web - DevOps Automation Portal: The PowerShell DevOps Portal provides a SharePoint-like experience but centralized around the execution of PowerShell scripts to automate DevOps oriented tasks.
    - Piwik for Microsoft Silverlight Analytics Framework: Piwik for Microsoft Silverlight Analytics adds Piwik support to the MSAF.
    - PlaygroundSite: pluralsight tut
    - Skewboid Approach to Pareto Relaxation and Filtering: For multi-objective decision-making and optimization: an adjustment parameter, alpha, is added to the conventional determination of Pareto optimality.
    - Software Factories: Software Factory Generators
    - spaceship cocos2dx demo: testing only
    - Spring_DWR: A sample from a recent requirement to use Spring with Direct Web Remoting (DWR).
    - testgit08172012git01: yuitest
    - tom08172012tfs01: ytryytr
    - touchdragdemo: touch drag demo
    - Virtual Pilot Client: An open-source pilot client for the VATSIM network. See discussions at http://forums.vatsim.net/viewforum.php?f=124
    - Visual Studio IDE Service - DTE: Visual Studio Services
    - WCF AOP Sample: This project contains a simple implementation of AOP for WCF. It illustrates how to do simple logging and param masking using basic WCF extensions and Unity.
    - WeatherApp: Application to view the weather

    Read the article

  • How to make negate_unary work with any type?

    - by Chan
    Hi, following this question: How to negate a predicate function using operator ! in C++? I want to create an operator ! that can work with any functor that inherits from unary_function. I tried:

        template<typename T>
        inline std::unary_negate<T> operator !( const T& pred )
        {
            return std::not1( pred );
        }

    The compiler complained:

        Error 5 error C2955: 'std::unary_function' : use of class template requires template argument list c:\program files\microsoft visual studio 10.0\vc\include\xfunctional 223 1 Graphic
        Error 7 error C2451: conditional expression of type 'std::unary_negate<_Fn1>' is illegal c:\program files\microsoft visual studio 10.0\vc\include\ostream 529 1 Graphic
        Error 3 error C2146: syntax error : missing ',' before identifier 'argument_type' c:\program files\microsoft visual studio 10.0\vc\include\xfunctional 222 1 Graphic
        Error 4 error C2065: 'argument_type' : undeclared identifier c:\program files\microsoft visual studio 10.0\vc\include\xfunctional 222 1 Graphic
        Error 2 error C2039: 'argument_type' : is not a member of 'std::basic_ostream<_Elem,_Traits>::sentry' c:\program files\microsoft visual studio 10.0\vc\include\xfunctional 222 1 Graphic
        Error 6 error C2039: 'argument_type' : is not a member of 'std::basic_ostream<_Elem,_Traits>::sentry' c:\program files\microsoft visual studio 10.0\vc\include\xfunctional 230 1 Graphic

    Any idea?

    Update: Following "templatetypedef"'s solution, I got new errors:

        Error 3 error C2831: 'operator !' cannot have default parameters c:\visual studio 2010 projects\graphic\graphic\main.cpp 39 1 Graphic
        Error 2 error C2808: unary 'operator !' has too many formal parameters c:\visual studio 2010 projects\graphic\graphic\main.cpp 39 1 Graphic
        Error 4 error C2675: unary '!' : 'is_prime' does not define this operator or a conversion to a type acceptable to the predefined operator c:\visual studio 2010 projects\graphic\graphic\main.cpp 52 1 Graphic

    Update 1: Complete code:

        #include <iostream>
        #include <functional>
        #include <utility>
        #include <cmath>
        #include <algorithm>
        #include <iterator>
        #include <string>
        #include <boost/assign.hpp>
        #include <boost/assign/std/vector.hpp>
        #include <boost/assign/std/map.hpp>
        #include <boost/assign/std/set.hpp>
        #include <boost/assign/std/list.hpp>
        #include <boost/assign/std/stack.hpp>
        #include <boost/assign/std/deque.hpp>

        struct is_prime : std::unary_function<int, bool>
        {
            bool operator()( int n ) const
            {
                if( n < 2 ) return 0;
                if( n == 2 || n == 3 ) return 1;
                if( n % 2 == 0 || n % 3 == 0 ) return 0;
                int upper_bound = std::sqrt( static_cast<double>( n ) );
                for( int pf = 5, step = 2; pf <= upper_bound; )
                {
                    if( n % pf == 0 ) return 0;
                    pf += step;
                    step = 6 - step;
                }
                return 1;
            }
        };

        /*
        template<typename T>
        inline std::unary_negate<T> operator !( const T& pred, typename T::argument_type* dummy = 0 )
        {
            return std::not1<T>( pred );
        }
        */

        inline std::unary_negate<is_prime> operator !( const is_prime& pred )
        {
            return std::not1( pred );
        }

        template<typename T>
        inline void print_con( const T& con, const std::string& ms = "", const std::string& sep = ", " )
        {
            std::cout << ms << '\n';
            std::copy( con.begin(), con.end(), std::ostream_iterator<typename T::value_type>( std::cout, sep.c_str() ) );
            std::cout << "\n\n";
        }

        int main()
        {
            using namespace boost::assign;
            std::vector<int> nums;
            nums += 1, 3, 5, 7, 9;
            nums.erase( remove_if( nums.begin(), nums.end(), !is_prime() ), nums.end() );
            print_con( nums, "After remove all primes" );
        }

    Thanks, Chan Nguyen
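    For what it's worth, here is a hedged sketch of one way around both sets of errors: keep operator ! a single-parameter template (so the C2831/C2808 restrictions on default operator parameters never apply) and constrain it through the return type, so the overload only participates for functors that expose a nested argument_type. This is an illustrative approach under that assumption, not necessarily the accepted answer:

        #include <functional>

        // Helper whose ::type is T1, but whose use also names T2; if T2 is
        // ill-formed (no nested argument_type), SFINAE drops the overload.
        template<typename T1, typename T2>
        struct first_of { typedef T1 type; };

        // Only viable for adaptable predicates (types with argument_type),
        // so it no longer collides with std::basic_ostream::sentry and friends.
        template<typename Pred>
        typename first_of< std::unary_negate<Pred>,
                           typename Pred::argument_type >::type
        operator !( const Pred& pred )
        {
            return std::not1( pred );
        }

    With that in place, !is_prime() in the remove_if call above negates the predicate, while ! applied to streams, bools and other non-predicate types still uses the built-in operator.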

    Read the article

  • Squid + Dans Guardian (simple configuration)

    - by The Digital Ninja
    I just built a new proxy server and compiled the latest versions of squid and dansguardian. We use basic authentication to select what users are allowed outside of our network. It seems squid is working just fine and accepts my username and password and lets me out. But if i connect to dans guardian, it prompts for username and password and then displays a message saying my username is not allowed to access the internet. Its pulling my username for the error message so i know it knows who i am. The part i get confused on is i thought that part was handled all by squid, and squid is working flawlessly. Can someone please double check my config files and tell me if i'm missing something or there is some new option i must set to get this to work. dansguardian.conf # Web Access Denied Reporting (does not affect logging) # # -1 = log, but do not block - Stealth mode # 0 = just say 'Access Denied' # 1 = report why but not what denied phrase # 2 = report fully # 3 = use HTML template file (accessdeniedaddress ignored) - recommended # reportinglevel = 3 # Language dir where languages are stored for internationalisation. # The HTML template within this dir is only used when reportinglevel # is set to 3. When used, DansGuardian will display the HTML file instead of # using the perl cgi script. This option is faster, cleaner # and easier to customise the access denied page. # The language file is used no matter what setting however. # languagedir = '/etc/dansguardian/languages' # language to use from languagedir. language = 'ukenglish' # Logging Settings # # 0 = none 1 = just denied 2 = all text based 3 = all requests loglevel = 3 # Log Exception Hits # Log if an exception (user, ip, URL, phrase) is matched and so # the page gets let through. Can be useful for diagnosing # why a site gets through the filter. on | off logexceptionhits = on # Log File Format # 1 = DansGuardian format 2 = CSV-style format # 3 = Squid Log File Format 4 = Tab delimited logfileformat = 1 # Log file location # # Defines the log directory and filename. #loglocation = '/var/log/dansguardian/access.log' # Network Settings # # the IP that DansGuardian listens on. If left blank DansGuardian will # listen on all IPs. That would include all NICs, loopback, modem, etc. # Normally you would have your firewall protecting this, but if you want # you can limit it to only 1 IP. Yes only one. filterip = # the port that DansGuardian listens to. filterport = 8080 # the ip of the proxy (default is the loopback - i.e. this server) proxyip = 127.0.0.1 # the port DansGuardian connects to proxy on proxyport = 3128 # accessdeniedaddress is the address of your web server to which the cgi # dansguardian reporting script was copied # Do NOT change from the default if you are not using the cgi. # accessdeniedaddress = 'http://YOURSERVER.YOURDOMAIN/cgi-bin/dansguardian.pl' # Non standard delimiter (only used with accessdeniedaddress) # Default is enabled but to go back to the original standard mode dissable it. nonstandarddelimiter = on # Banned image replacement # Images that are banned due to domain/url/etc reasons including those # in the adverts blacklists can be replaced by an image. This will, # for example, hide images from advert sites and remove broken image # icons from banned domains. # 0 = off # 1 = on (default) usecustombannedimage = 1 custombannedimagefile = '/etc/dansguardian/transparent1x1.gif' # Filter groups options # filtergroups sets the number of filter groups. 
A filter group is a set of content # filtering options you can apply to a group of users. The value must be 1 or more. # DansGuardian will automatically look for dansguardianfN.conf where N is the filter # group. To assign users to groups use the filtergroupslist option. All users default # to filter group 1. You must have some sort of authentication to be able to map users # to a group. The more filter groups the more copies of the lists will be in RAM so # use as few as possible. filtergroups = 1 filtergroupslist = '/etc/dansguardian/filtergroupslist' # Authentication files location bannediplist = '/etc/dansguardian/bannediplist' exceptioniplist = '/etc/dansguardian/exceptioniplist' banneduserlist = '/etc/dansguardian/banneduserlist' exceptionuserlist = '/etc/dansguardian/exceptionuserlist' # Show weighted phrases found # If enabled then the phrases found that made up the total which excedes # the naughtyness limit will be logged and, if the reporting level is # high enough, reported. on | off showweightedfound = on # Weighted phrase mode # There are 3 possible modes of operation: # 0 = off = do not use the weighted phrase feature. # 1 = on, normal = normal weighted phrase operation. # 2 = on, singular = each weighted phrase found only counts once on a page. # weightedphrasemode = 2 # Positive result caching for text URLs # Caches good pages so they don't need to be scanned again # 0 = off (recommended for ISPs with users with disimilar browsing) # 1000 = recommended for most users # 5000 = suggested max upper limit urlcachenumber = # # Age before they are stale and should be ignored in seconds # 0 = never # 900 = recommended = 15 mins urlcacheage = # Smart and Raw phrase content filtering options # Smart is where the multiple spaces and HTML are removed before phrase filtering # Raw is where the raw HTML including meta tags are phrase filtered # CPU usage can be effectively halved by using setting 0 or 1 # 0 = raw only # 1 = smart only # 2 = both (default) phrasefiltermode = 2 # Lower casing options # When a document is scanned the uppercase letters are converted to lower case # in order to compare them with the phrases. However this can break Big5 and # other 16-bit texts. If needed preserve the case. As of version 2.7.0 accented # characters are supported. # 0 = force lower case (default) # 1 = do not change case preservecase = 0 # Hex decoding options # When a document is scanned it can optionally convert %XX to chars. # If you find documents are getting past the phrase filtering due to encoding # then enable. However this can break Big5 and other 16-bit texts. # 0 = disabled (default) # 1 = enabled hexdecodecontent = 0 # Force Quick Search rather than DFA search algorithm # The current DFA implementation is not totally 16-bit character compatible # but is used by default as it handles large phrase lists much faster. # If you wish to use a large number of 16-bit character phrases then # enable this option. # 0 = off (default) # 1 = on (Big5 compatible) forcequicksearch = 0 # Reverse lookups for banned site and URLs. # If set to on, DansGuardian will look up the forward DNS for an IP URL # address and search for both in the banned site and URL lists. This would # prevent a user from simply entering the IP for a banned address. # It will reduce searching speed somewhat so unless you have a local caching # DNS server, leave it off and use the Blanket IP Block option in the # bannedsitelist file instead. reverseaddresslookups = off # Reverse lookups for banned and exception IP lists. 
# If set to on, DansGuardian will look up the forward DNS for the IP # of the connecting computer. This means you can put in hostnames in # the exceptioniplist and bannediplist. # It will reduce searching speed somewhat so unless you have a local DNS server, # leave it off. reverseclientiplookups = off # Build bannedsitelist and bannedurllist cache files. # This will compare the date stamp of the list file with the date stamp of # the cache file and will recreate as needed. # If a bsl or bul .processed file exists, then that will be used instead. # It will increase process start speed by 300%. On slow computers this will # be significant. Fast computers do not need this option. on | off createlistcachefiles = on # POST protection (web upload and forms) # does not block forms without any file upload, i.e. this is just for # blocking or limiting uploads # measured in kibibytes after MIME encoding and header bumph # use 0 for a complete block # use higher (e.g. 512 = 512Kbytes) for limiting # use -1 for no blocking #maxuploadsize = 512 #maxuploadsize = 0 maxuploadsize = -1 # Max content filter page size # Sometimes web servers label binary files as text which can be very # large which causes a huge drain on memory and cpu resources. # To counter this, you can limit the size of the document to be # filtered and get it to just pass it straight through. # This setting also applies to content regular expression modification. # The size is in Kibibytes - eg 2048 = 2Mb # use 0 for no limit maxcontentfiltersize = # Username identification methods (used in logging) # You can have as many methods as you want and not just one. The first one # will be used then if no username is found, the next will be used. # * proxyauth is for when basic proxy authentication is used (no good for # transparent proxying). # * ntlm is for when the proxy supports the MS NTLM authentication # protocol. (Only works with IE5.5 sp1 and later). **NOT IMPLEMENTED** # * ident is for when the others don't work. It will contact the computer # that the connection came from and try to connect to an identd server # and query it for the user owner of the connection. usernameidmethodproxyauth = on usernameidmethodntlm = off # **NOT IMPLEMENTED** usernameidmethodident = off # Preemptive banning - this means that if you have proxy auth enabled and a user accesses # a site banned by URL for example they will be denied straight away without a request # for their user and pass. This has the effect of requiring the user to visit a clean # site first before it knows who they are and thus maybe an admin user. # This is how DansGuardian has always worked but in some situations it is less than # ideal. So you can optionally disable it. Default is on. # As a side effect disabling this makes AD image replacement work better as the mime # type is know. preemptivebanning = on # Misc settings # if on it adds an X-Forwarded-For: <clientip> to the HTTP request # header. This may help solve some problem sites that need to know the # source ip. on | off forwardedfor = on # if on it uses the X-Forwarded-For: <clientip> to determine the client # IP. This is for when you have squid between the clients and DansGuardian. # Warning - headers are easily spoofed. on | off usexforwardedfor = off # if on it logs some debug info regarding fork()ing and accept()ing which # can usually be ignored. These are logged by syslog. 
It is safe to leave # it on or off logconnectionhandlingerrors = on # Fork pool options # sets the maximum number of processes to sporn to handle the incomming # connections. Max value usually 250 depending on OS. # On large sites you might want to try 180. maxchildren = 180 # sets the minimum number of processes to sporn to handle the incomming connections. # On large sites you might want to try 32. minchildren = 32 # sets the minimum number of processes to be kept ready to handle connections. # On large sites you might want to try 8. minsparechildren = 8 # sets the minimum number of processes to sporn when it runs out # On large sites you might want to try 10. preforkchildren = 10 # sets the maximum number of processes to have doing nothing. # When this many are spare it will cull some of them. # On large sites you might want to try 64. maxsparechildren = 64 # sets the maximum age of a child process before it croaks it. # This is the number of connections they handle before exiting. # On large sites you might want to try 10000. maxagechildren = 5000 # Process options # (Change these only if you really know what you are doing). # These options allow you to run multiple instances of DansGuardian on a single machine. # Remember to edit the log file path above also if that is your intention. # IPC filename # # Defines IPC server directory and filename used to communicate with the log process. ipcfilename = '/tmp/.dguardianipc' # URL list IPC filename # # Defines URL list IPC server directory and filename used to communicate with the URL # cache process. urlipcfilename = '/tmp/.dguardianurlipc' # PID filename # # Defines process id directory and filename. #pidfilename = '/var/run/dansguardian.pid' # Disable daemoning # If enabled the process will not fork into the background. # It is not usually advantageous to do this. # on|off ( defaults to off ) nodaemon = off # Disable logging process # on|off ( defaults to off ) nologger = off # Daemon runas user and group # This is the user that DansGuardian runs as. Normally the user/group nobody. # Uncomment to use. Defaults to the user set at compile time. # daemonuser = 'nobody' # daemongroup = 'nobody' # Soft restart # When on this disables the forced killing off all processes in the process group. # This is not to be confused with the -g run time option - they are not related. # on|off ( defaults to off ) softrestart = off maxcontentramcachescansize = 2000 maxcontentfilecachescansize = 20000 downloadmanager = '/etc/dansguardian/downloadmanagers/default.conf' authplugin = '/etc/dansguardian/authplugins/proxy-basic.conf' Squid.conf http_port 3128 hierarchy_stoplist cgi-bin ? acl QUERY urlpath_regex cgi-bin \? cache deny QUERY acl apache rep_header Server ^Apache #broken_vary_encoding allow apache access_log /squid/var/logs/access.log squid hosts_file /etc/hosts auth_param basic program /squid/libexec/ncsa_auth /squid/etc/userbasic.auth auth_param basic children 5 auth_param basic realm proxy auth_param basic credentialsttl 2 hours auth_param basic casesensitive off refresh_pattern ^ftp: 1440 20% 10080 refresh_pattern ^gopher: 1440 0% 1440 refresh_pattern . 
0 20% 4320 acl NoAuthNec src <HIDDEN FOR SECURITY> acl BrkRm src <HIDDEN FOR SECURITY> acl Dials src <HIDDEN FOR SECURITY> acl Comps src <HIDDEN FOR SECURITY> acl whsws dstdom_regex -i .opensuse.org .novell.com .suse.com mirror.mcs.an1.gov mirrors.kernerl.org www.suse.de suse.mirrors.tds.net mirrros.usc.edu ftp.ale.org suse.cs.utah.edu mirrors.usc.edu mirror.usc.an1.gov linux.nssl.noaa.gov noaa.gov .kernel.org ftp.ale.org ftp.gwdg.de .medibuntu.org mirrors.xmission.com .canonical.com .ubuntu. acl opensites dstdom_regex -i .mbsbooks.com .bowker.com .usps.com .usps.gov .ups.com .fedex.com go.microsoft.com .microsoft.com .apple.com toolbar.msn.com .contacts.msn.com update.services.openoffice.org fms2.pointroll.speedera.net services.wmdrm.windowsmedia.com windowsupdate.com .adobe.com .symantec.com .vitalbook.com vxn1.datawire.net vxn.datawire.net download.lavasoft.de .download.lavasoft.com .lavasoft.com updates.ls-servers.com .canadapost. .myyellow.com minirick symantecliveupdate.com wm.overdrive.com www.overdrive.com productactivation.one.microsoft.com www.update.microsoft.com testdrive.whoson.com www.columbia.k12.mo.us banners.wunderground.com .kofax.com .gotomeeting.com tools.google.com .dl.google.com .cache.googlevideo.com .gpdl.google.com .clients.google.com cache.pack.google.com kh.google.com maps.google.com auth.keyhole.com .contacts.msn.com .hrblock.com .taxcut.com .merchantadvantage.com .jtv.com .malwarebytes.org www.google-analytics.com dcs.support.xerox.com .dhl.com .webtrendslive.com javadl-esd.sun.com javadl-alt.sun.com .excelsior.edu .dhlglobalmail.com .nessus.org .foxitsoftware.com foxit.vo.llnwd.net installshield.com .mindjet.com .mediascouter.com media.us.elsevierhealth.com .xplana.com .govtrack.us sa.tulsacc.edu .omniture.com fpdownload.macromedia.com webservices.amazon.com acl password proxy_auth REQUIRED acl all src all acl manager proto cache_object acl localhost src 127.0.0.1/255.255.255.255 acl to_localhost dst 127.0.0.0/8 acl SSL_ports port 443 563 631 2001 2005 8731 9001 9080 10000 acl Safe_ports port 80 # http acl Safe_ports port 21 # ftp acl Safe_ports port # https, snews 443 563 acl Safe_ports port 70 # gopher acl Safe_ports port 210 # wais acl Safe_ports port # unregistered ports 1936-65535 acl Safe_ports port 280 # http-mgmt acl Safe_ports port 488 # gss-http acl Safe_ports port 10000 acl Safe_ports port 631 acl Safe_ports port 901 # SWAT acl purge method PURGE acl CONNECT method CONNECT acl UTubeUsers proxy_auth "/squid/etc/utubeusers.list" acl RestrictUTube dstdom_regex -i youtube.com acl RestrictFacebook dstdom_regex -i facebook.com acl FacebookUsers proxy_auth "/squid/etc/facebookusers.list" acl BuemerKEC src 10.10.128.0/24 acl MBSsortnet src 10.10.128.0/26 acl MSNExplorer browser -i MSN acl Printers src <HIDDEN FOR SECURITY> acl SpecialFolks src <HIDDEN FOR SECURITY> # streaming download acl fails rep_mime_type ^.*mms.* acl fails rep_mime_type ^.*ms-hdr.* acl fails rep_mime_type ^.*x-fcs.* acl fails rep_mime_type ^.*x-ms-asf.* acl fails2 urlpath_regex dvrplayer mediastream mms:// acl fails2 urlpath_regex \.asf$ \.afx$ \.flv$ \.swf$ acl deny_rep_mime_flashvideo rep_mime_type -i video/flv acl deny_rep_mime_shockwave rep_mime_type -i ^application/x-shockwave-flash$ acl x-type req_mime_type -i ^application/octet-stream$ acl x-type req_mime_type -i application/octet-stream acl x-type req_mime_type -i ^application/x-mplayer2$ acl x-type req_mime_type -i application/x-mplayer2 acl x-type req_mime_type -i ^application/x-oleobject$ acl x-type req_mime_type -i 
application/x-oleobject acl x-type req_mime_type -i application/x-pncmd acl x-type req_mime_type -i ^video/x-ms-asf$ acl x-type2 rep_mime_type -i ^application/octet-stream$ acl x-type2 rep_mime_type -i application/octet-stream acl x-type2 rep_mime_type -i ^application/x-mplayer2$ acl x-type2 rep_mime_type -i application/x-mplayer2 acl x-type2 rep_mime_type -i ^application/x-oleobject$ acl x-type2 rep_mime_type -i application/x-oleobject acl x-type2 rep_mime_type -i application/x-pncmd acl x-type2 rep_mime_type -i ^video/x-ms-asf$ acl RestrictHulu dstdom_regex -i hulu.com acl broken dstdomain cms.montgomerycollege.edu events.columbiamochamber.com members.columbiamochamber.com public.genexusserver.com acl RestrictVimeo dstdom_regex -i vimeo.com acl http_port port 80 #http_reply_access deny deny_rep_mime_flashvideo #http_reply_access deny deny_rep_mime_shockwave #streaming files #http_access deny fails #http_reply_access deny fails #http_access deny fails2 #http_reply_access deny fails2 #http_access deny x-type #http_reply_access deny x-type #http_access deny x-type2 #http_reply_access deny x-type2 follow_x_forwarded_for allow localhost acl_uses_indirect_client on log_uses_indirect_client on http_access allow manager localhost http_access deny manager http_access allow purge localhost http_access deny purge http_access allow SpecialFolks http_access deny CONNECT !SSL_ports http_access allow whsws http_access allow opensites http_access deny BuemerKEC !MBSsortnet http_access deny BrkRm RestrictUTube RestrictFacebook RestrictVimeo http_access allow RestrictUTube UTubeUsers http_access deny RestrictUTube http_access allow RestrictFacebook FacebookUsers http_access deny RestrictFacebook http_access deny RestrictHulu http_access allow NoAuthNec http_access allow BrkRm http_access allow FacebookUsers RestrictVimeo http_access deny RestrictVimeo http_access allow Comps http_access allow Dials http_access allow Printers http_access allow password http_access deny !Safe_ports http_access deny SSL_ports !CONNECT http_access allow http_port http_access deny all http_reply_access allow all icp_access allow all access_log /squid/var/logs/access.log squid visible_hostname proxy.site.com forwarded_for off coredump_dir /squid/cache/ #header_access Accept-Encoding deny broken #acl snmppublic snmp_community mysecretcommunity #snmp_port 3401 #snmp_access allow snmppublic all cache_mem 3 GB #acl snmppublic snmp_community mbssquid #snmp_port 3401 #snmp_access allow snmppublic all
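    A hedged aside for comparing the two hops: since Squid on 3128 accepts the credentials but DansGuardian on 8080 rejects them, sending the same authenticated request through each proxy can confirm where the denial happens. A sketch with curl; the proxy hostname is a placeholder:

        # Sketch: same request through each hop (proxy host is a placeholder).
        curl -v -x http://proxyhost:3128 --proxy-user user:pass http://example.com/   # straight to Squid
        curl -v -x http://proxyhost:8080 --proxy-user user:pass http://example.com/   # through DansGuardian

    If the first succeeds and the second is denied, the place to look is the DansGuardian side (filtergroupslist, banneduserlist/exceptionuserlist, or the proxy-basic auth plugin configured above) rather than Squid's ACLs.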

    Read the article

  • Silverlight - Adding Text to Pushpin in Bing Maps via C#

    - by Morano88
    I was able to make my Silverlight Bing map accept mouse clicks and convert them to pushpins in C#. Now I want to show text next to the pushpin as a description that appears when the mouse goes over the pin; I have no clue how to do that. What are the methods that enable me to do this? This is the C# code:

        public partial class MainPage : UserControl
        {
            private MapLayer m_PushpinLayer;

            public MainPage()
            {
                InitializeComponent();
                base.Loaded += OnLoaded;
            }

            private void OnLoaded(object sender, RoutedEventArgs e)
            {
                base.Loaded -= OnLoaded;
                m_PushpinLayer = new MapLayer();
                x_Map.Children.Add(m_PushpinLayer);
                x_Map.MouseClick += OnMouseClick;
            }

            private void AddPushpin(double latitude, double longitude)
            {
                Pushpin pushpin = new Pushpin();
                pushpin.MouseEnter += OnMouseEnter;
                pushpin.MouseLeave += OnMouseLeave;
                m_PushpinLayer.AddChild(pushpin, new Location(latitude, longitude), PositionOrigin.BottomCenter);
            }

            private void OnMouseClick(object sender, MapMouseEventArgs e)
            {
                Point clickLocation = e.ViewportPoint;
                Location location = x_Map.ViewportPointToLocation(clickLocation);
                AddPushpin(location.Latitude, location.Longitude);
            }

            private void OnMouseLeave(object sender, MouseEventArgs e)
            {
                Pushpin pushpin = sender as Pushpin;
                // remove the pushpin transform when mouse leaves
                pushpin.RenderTransform = null;
            }

            private void OnMouseEnter(object sender, MouseEventArgs e)
            {
                Pushpin pushpin = sender as Pushpin;
                // scaling will shrink (less than 1) or enlarge (greater than 1) source element
                ScaleTransform st = new ScaleTransform();
                st.ScaleX = 1.4;
                st.ScaleY = 1.4;
                // set center of scaling to center of pushpin
                st.CenterX = (pushpin as FrameworkElement).Height / 2;
                st.CenterY = (pushpin as FrameworkElement).Height / 2;
                pushpin.RenderTransform = st;
            }
        }
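    A minimal sketch of one way to do this: attach a standard Silverlight tooltip to each pin so the description appears on hover. The description parameter is an assumption (the caller supplies whatever text should be shown); richer layouts are also possible by adding custom child elements to the map layer instead.

        // Sketch: show a caller-supplied description on hover via ToolTipService.
        private void AddPushpin(double latitude, double longitude, string description)
        {
            Pushpin pushpin = new Pushpin();
            pushpin.MouseEnter += OnMouseEnter;
            pushpin.MouseLeave += OnMouseLeave;

            // Attaches a hover tooltip to the pin.
            ToolTipService.SetToolTip(pushpin, description);

            m_PushpinLayer.AddChild(pushpin, new Location(latitude, longitude), PositionOrigin.BottomCenter);
        }

    OnMouseClick would then pass the text along, e.g. AddPushpin(location.Latitude, location.Longitude, "Dropped pin").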

    Read the article

  • Autofac Unit Testing using RegisterControllers()

    - by Kane
    I am having problems using Autofac 2.1.13 and writing my unit tests for my ASP.NET MVC2 application. I can't seem to resolve controllers when using the RegisterControllers method. I have tried using the Resolve<>() and ControllerBuilder.Current.GetControllerFactory().CreateController() methods, but to no avail. I am sure that I've missed something simple here, so can anyone assist? This was my first attempt at resolving the HomeController, which does not work:

        ContainerBuilder builder = new ContainerBuilder();
        builder.RegisterControllers(typeof(HomeController).Assembly);
        IContainer container = builder.Build();

        // Throws a first chance exception of type
        // 'Autofac.Core.Registration.ComponentNotRegisteredException' in Autofac.dll
        var homeController = container.Resolve<HomeController>();

    Similarly, this does not work either:

        ContainerBuilder builder = new ContainerBuilder();
        builder.RegisterControllers(typeof(HomeController).Assembly);
        IContainer container = builder.Build();
        var containerProvider = new ContainerProvider(container);
        ControllerBuilder.Current.SetControllerFactory(new AutofacControllerFactory(containerProvider));

        var request = new Mock<HttpRequestBase>(MockBehavior.Loose);
        request.Setup(r => r.Path).Returns("Path");
        var httpContext = new Mock<HttpContextBase>(MockBehavior.Loose);
        httpContext.SetupGet(c => c.Request).Returns(request.Object);

        ControllerBuilder.Current.GetControllerFactory().CreateController(new RequestContext(httpContext.Object, new RouteData()), "home");

    Any assistance would be greatly appreciated. I should note that if I register my controllers without using the RegisterControllers() method, my unit tests work. My question seems to be limited specifically to the RegisterControllers() method.
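    One hedged possibility, sketched below: some builds of the MVC integration register controllers as named services (keyed off the controller name) rather than as their concrete types, in which case a typed Resolve fails while the AutofacControllerFactory path succeeds. The service key format here is purely an assumption; check the RegisterControllers source for your version before relying on it.

        // Sketch: if controllers are registered as *named* services, a typed
        // Resolve<HomeController>() throws ComponentNotRegisteredException.
        // The "controller.home" key below is an assumed naming scheme.
        var builder = new ContainerBuilder();
        builder.RegisterControllers(typeof(HomeController).Assembly);
        var container = builder.Build();

        var home = container.ResolveNamed<HomeController>("controller.home");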

    Read the article

  • The endpoint reference (EPR) for the Operation not found is

    - by Denise Wu
    I have been struggling with the following error for the last couple of days; can you please help? I generated my server and client code using the wsdl2java tool from a WSDL 2.0 file. When invoking the web service I am getting the following error:

        org.apache.axis2.AxisFault: The endpoint reference (EPR) for the Operation not found is
        /axis2/services/MyService/authentication/?username=Denise345&password=xxxxx
        and the WSA Action = null

    My service is displayed on the axis2 web page with all available methods. Here is the output from TcpMon:

        ==============
        Listen Port: 8090
        Target Host: 127.0.0.1
        Target Port: 8080

        ==== Request ====
        GET /axis2/services/MyService/authentication/?username=Denise345&password=xxxxx HTTP/1.1
        Content-Type: application/x-www-form-urlencoded; charset=UTF-8
        SOAPAction: ""
        User-Agent: Axis2
        Host: 127.0.0.1:8090

        ==== Response ====
        HTTP/1.1 500 Internal Server Error
        Server: Apache-Coyote/1.1
        Content-Type: application/xml;charset=UTF-8
        Transfer-Encoding: chunked
        Date: Thu, 12 May 2011 15:53:20 GMT
        Connection: close

        12b
        <soapenv:Reason xmlns:soapenv="http://www.w3.org/2003/05/soap-envelope">
        <soapenv:Text xml:lang="en-US">The endpoint reference (EPR) for the Operation not found is /axis2/services/MyService/authentication/?username=Denise345&password=xxxxx and the WSA Action = null</soapenv:Text></soapenv:Reason>
        0
        ==============

    I am using:
    * axis2-1.5.4
    * Tomcat 7.0.8
    * a WSDL 2.0 file

    Please help!
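    Not a definitive fix, but one thing the trace suggests: the request goes out as a bare REST-style GET with an empty SOAPAction, so Axis2 has no action to dispatch on ("WSA Action = null"). A sketch of forcing an action on the generated stub's options; the stub class and action URI are placeholders for whatever wsdl2java generated in your case:

        // Sketch: stub/action names are placeholders from the generated code.
        MyServiceStub stub = new MyServiceStub(
                "http://127.0.0.1:8080/axis2/services/MyService");
        stub._getServiceClient().getOptions()
            .setAction("urn:authentication");   // match the wsa:Action in your WSDL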

    Read the article

  • Cannot import the following keyfile: blah.pfx. The keyfile may be password protected.

    - by JasonD
    We just upgraded our Visual Studio 2008 projects to VS2010. All of our assemblies were strong-named using a Verisign code signing certificate. Since the upgrade we continuously get the following error:

        Cannot import the following key file: companyname.pfx. The key file may be password protected. To correct this, try to import the certificate again or manually install the certificate to the Strong Name CSP with the following key container name: VS_KEY_3E185446540E7F7A

    This happens on some developer machines and not others. Some methods that fixed it some of the time include:

    - re-installing the key file from Windows Explorer (right-click on the PFX file and click Install)
    - installing VS2010 on a fresh machine: the first time you open the project it prompts you for the password, and then it works. On machines upgraded from VS2008, you don't get this option.

    I've tried using the SN.EXE utility to register the key with the Strong Name CSP as the error message suggests, but whenever I run the tool with any options using the version that came with VS2010, SN.EXE just lists its command-line arguments instead of doing anything. This happens regardless of what arguments I supply. Does anyone know WHY this is happening, and have clear steps to fix it? I'm about to give up on ClickOnce installs and Microsoft code signing. Thanks for any help!
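
    For reference, the import the error message asks for would look something like this (a hedged sketch; sn.exe prompts for the PFX password during the import, and it should be run from a Visual Studio command prompt so the right version is on the PATH):

        sn -i companyname.pfx VS_KEY_3E185446540E7F7A

    If sn.exe only prints its usage text, it usually failed to parse the arguments at all; quoting a path that contains spaces is worth trying.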

    Read the article

  • Encryption is hard: AES encryption to Hex

    - by Rob Cameron
    So, I've got an app at work that encrypts a string using ColdFusion. ColdFusion's built-in encryption helpers make it pretty simple:

        encrypt('string_to_encrypt','key','AES','HEX')

    What I'm trying to do is use Ruby to create the same encrypted string that this ColdFusion script is creating. Unfortunately, encryption is the most confusing computer science subject known to man. I found a couple of helper methods that use the openssl library and give you a really simple encryption/decryption method. Here's the resulting string:

        "\370\354D\020\357A\227\377\261G\333\314\204\361\277\250"

    Which looks unicode-ish to me. I've tried several libraries to convert this to hex, but they all say it contains invalid characters. Trying to unpack it results in this:

        string = "\370\354D\020\357A\227\377\261G\333\314\204\361\277\250"
        string.unpack('U')
        ArgumentError: malformed UTF-8 character
            from (irb):19:in `unpack'
            from (irb):19

    At the end of the day it's supposed to look like this (the output of the ColdFusion encrypt method):

        F8E91A689565ED24541D2A0109F201EF

    Of course that's assuming that all the padding, initialization vectors, salts, cipher types and a million other possible differences all line up. Here's the simple script I'm using to encrypt/decrypt:

        def aes(m,k,t)
          (aes = OpenSSL::Cipher::Cipher.new('aes-256-cbc').send(m)).key = Digest::SHA256.digest(k)
          aes.update(t) << aes.final
        end

        def encrypt(key, text)
          aes(:encrypt, key, text)
        end

        def decrypt(key, text)
          aes(:decrypt, key, text)
        end

    Any help? Maybe just a simple option I can pass to OpenSSL::Cipher::Cipher that will tell it to hex-encode the final string?

    Read the article

  • c# WinForms ReportViewer Performance issue using RefreshReport() and ServerReport.SetParameters()

    - by mdk
    Hi All, I am currently writing a C# client application that uses the WinForms ReportViewer control to display reports from a remote server. I am having performance troubles with the ReportViewer control, specifically with the two methods reportViewer.ServerReport.SetParameters() and reportViewer.RefreshReport(): they both take a really long time to complete, and not just on the very first call, but on each subsequent call as well. SetParameters() takes 20 to 40 seconds (the times vary greatly; some calls execute acceptably fast) and RefreshReport() is a bit faster but still takes ages. I don't think the server is the culprit, as the same report viewed in the browser renders pretty fast, about a second tops. The report in question doesn't matter either. When I break into the process and take a look at the call stack, I see a call to Socket.DoConnect. So I thought that was a good reason to start using Fiddler. I installed it, disabled caching and fired up the app again to see which call takes that long to connect, but the performance issue was gone. Going through a proxy gives me the same performance as the web browser. FYI: I am using NTLM authentication in the following way:

        reportViewer.ServerReport.ReportServerCredentials.NetworkCredentials = new NetworkCredentials() { Username = ... }

    I don't have a strong web background, so I guess my question is: what should this tell me, and what should I be looking into? (Btw: adding Fiddler to my installation package is not the solution I am looking for :)) I am grateful for any pointers. Take care, -Martin
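
    The fact that routing traffic through Fiddler (a local proxy) removes the delay points at the time being spent in automatic proxy detection before each request, rather than in the report server itself. A hedged workaround to try, assuming the client machines don't actually need a proxy: disable the default proxy for the process before the first ReportViewer call.

        // run once at startup: skips automatic proxy detection for all WebRequests in this process
        System.Net.WebRequest.DefaultWebProxy = null;

    The same effect can be achieved declaratively with <defaultProxy enabled="false" /> under <system.net> in the app.config.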

    Read the article

  • Why LINQ to Entities won't let me initialize just some properties of an Entity?

    - by emzero
    So I've started to add Entity Framework 4 into a legacy web application (ASP.NET WebForms). As a start I have auto-generated some entities from the database. I also want to apply the Repository pattern. There is an entity called Visitor and its repository VisitorRepository. In VisitorRepository I have the following method:

        public IEnumerable<Visitor> GetActiveVisitors()
        {
            var visitors = from v in _context.Visitors
                           where v.IsActive
                           select new Visitor
                           {
                               VisitorID = v.VisitorID,
                               EmailAddress = v.EmailAddress,
                               RegisterDate = v.RegisterDate,
                               Company = v.Company,
                               Position = v.Position,
                               FirstName = v.FirstName,
                               LastName = v.LastName
                           };
            return visitors.ToList();
        }

    That list is then bound to a repeater, and when trying to do <%# Eval('EmailAddress') #%> it throws the following exception:

        The entity or complex type 'Model.Visitor' cannot be constructed in a LINQ to Entities query.

    A) Why is this happening? How can I work around it? Do I need to select an anonymous type and then use that to initialize my entities?

    B) Why does every example I've seen use 'select new' (anonymous objects) instead of initializing a known type? Anonymous types are useless unless you are retrieving the data and showing it in the same layer. As far as I know, anonymous types cannot be returned from methods, so what's the real point of them?

    Thank you all
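
    On A): LINQ to Entities refuses to construct mapped entity types inside a store query, so the usual workaround is indeed to project into an anonymous type, switch to LINQ to Objects with AsEnumerable(), and build the entities in memory. A minimal sketch of GetActiveVisitors rewritten that way:

        public IEnumerable<Visitor> GetActiveVisitors()
        {
            return (from v in _context.Visitors
                    where v.IsActive
                    select new { v.VisitorID, v.EmailAddress, v.RegisterDate,
                                 v.Company, v.Position, v.FirstName, v.LastName })
                   .AsEnumerable() // everything after this runs in memory, where constructing entities is allowed
                   .Select(x => new Visitor
                   {
                       VisitorID = x.VisitorID,
                       EmailAddress = x.EmailAddress,
                       RegisterDate = x.RegisterDate,
                       Company = x.Company,
                       Position = x.Position,
                       FirstName = x.FirstName,
                       LastName = x.LastName
                   })
                   .ToList();
        }

    This also answers part of B): the anonymous type never leaves the method; it exists only as the intermediate projection between the store query and the in-memory mapping.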

    Read the article

  • MVC2 DataAnnotations on ViewModel - ModelState.isValid Always Returns true

    - by ScottSEA
    I have an MVC2 application that uses the MVVM pattern. I am trying to use Data Annotations to validate form input. In my ThingsController I have two methods:

        [HttpGet]
        public ActionResult Index()
        {
            return View();
        }

        public ActionResult Details(ThingsViewModel tvm)
        {
            if (!ModelState.IsValid)
                return View(tvm);

            try
            {
                Query q = new Query(tvm.Query);
                ThingRepository repository = new ThingRepository(q);
                tvm.Airplanes = repository.All();
                return View(tvm);
            }
            catch (Exception)
            {
                return View();
            }
        }

    My Details.aspx view is strongly typed to the ThingsViewModel:

        <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<Config.Web.Models.ThingsViewModel>" %>

    The ViewModel is a class consisting of an IList of returned Thing objects and the Query string (which is submitted on the form), and it has the Required data annotation:

        public class ThingsViewModel
        {
            public IList<Thing> Things { get; set; }

            [Required(ErrorMessage = "You must enter a query")]
            public string Query { get; set; }
        }

    When I run this and click the submit button on the form without entering a value, I get a YSOD with the following error:

        The model item passed into the dictionary is of type 'Config.Web.Models.ThingsViewModel', but this dictionary requires a model item of type 'System.Collections.Generic.IEnumerable`1[Config.Domain.Entities.Thing]'.

    How can I get Data Annotations to work with a ViewModel? I cannot see what I'm missing or where I'm going wrong; the VM was working just fine before I started mucking around with validation.
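
    The exception is a hint about the failing render rather than about validation: something in the Details page (typically a partial or master typed to IEnumerable<Thing>) is being handed the whole ThingsViewModel as its model. A hedged sketch of the two changes that usually clear this up; the partial name here is hypothetical:

        public ActionResult Details(ThingsViewModel tvm)
        {
            if (!ModelState.IsValid)
            {
                // the invalid-model path re-renders the view, so never hand it a null collection
                tvm.Things = tvm.Things ?? new List<Thing>();
                return View(tvm);
            }
            // ...
        }

    And in Details.aspx, pass the list explicitly to any partial typed to IEnumerable<Thing> instead of letting it inherit the page model:

        <% Html.RenderPartial("ThingList", Model.Things); %>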

    Read the article

  • ASPNET MVC - Override Html.TextBoxFor(model.property) with a new helper with same signature?

    - by JK
    I want to override Html.TextBoxFor() with my own helper that has the exact same signature (but a different namespace, of course). Is this possible, and if so, how? The reason is that I have 100+ views in an already existing app, and I want to change the behaviour of TextBoxFor so that it outputs a maxLength=n attribute if the property has a [StringLength(n)] annotation. The code for automatically outputting maxlength=n is in this question: http://stackoverflow.com/questions/2386365/maxlength-attribute-of-a-text-box-from-the-dataannotations-stringlength-in-mvc2. But my question is not a duplicate; I am trying to create a more generic solution, where the DataAnnotation flows into the html automatically without any additional code by the person writing the view. In the referenced question, you have to change every single Html.TextBoxFor to a Html.CustomTextBoxFor. I need to do it so that the existing TextBoxFor()'s do not need to be changed, hence creating a helper with the same signature: change the behaviour of the helper method, and all existing instances will just work without any changes (100+ views, at least 500 TextBoxFor()s; I don't want to manually edit that). I tried this code (it would need to be repeated for each overload of TextBoxFor, but once the root problem is solved, that will be trivial):

        namespace My.Helpers
        {
            public static class CustomTextBoxHelper
            {
                public static MvcHtmlString TextBoxFor<TModel, TProperty>(this HtmlHelper<TModel> htmlHelper,
                    Expression<Func<TModel, TProperty>> expression, object htmlAttributes, bool includeLengthIfAnnotated)
                {
                    // implementation here
                }
            }
        }

    But I am getting a compiler error in the view on Html.TextBoxFor(): "The call is ambiguous between the following methods or properties" (of course). Is there any way to do this? Is there an alternative approach that would allow me to change the behaviour of Html.TextBoxFor, so that the views that already use it do not need to be changed?
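
    Extension methods can't override one another: when two matching signatures are in scope, the call is ambiguous rather than resolved to the "nearer" one. The usual way out is to stop importing System.Web.Mvc.Html in the views (remove it from the <namespaces> section of the views' web.config and add your own namespace instead) and have your helper delegate to the framework's helper through its full type name. A minimal sketch, assuming the expression is a simple property access:

        using System;
        using System.ComponentModel.DataAnnotations;
        using System.Linq.Expressions;
        using System.Web.Mvc;
        using System.Web.Routing;

        namespace My.Helpers
        {
            public static class CustomTextBoxHelper
            {
                public static MvcHtmlString TextBoxFor<TModel, TProperty>(
                    this HtmlHelper<TModel> htmlHelper,
                    Expression<Func<TModel, TProperty>> expression)
                {
                    var attributes = new RouteValueDictionary();
                    // pull [StringLength(n)] off the property, if present, and emit maxlength="n"
                    var member = expression.Body as MemberExpression;
                    if (member != null)
                    {
                        var stringLength = (StringLengthAttribute)Attribute.GetCustomAttribute(
                            member.Member, typeof(StringLengthAttribute));
                        if (stringLength != null)
                            attributes["maxlength"] = stringLength.MaximumLength;
                    }
                    // delegate through the full type name so this helper doesn't recurse into itself
                    return System.Web.Mvc.Html.InputExtensions.TextBoxFor(htmlHelper, expression, attributes);
                }
            }
        }

    With System.Web.Mvc.Html no longer imported, the existing TextBoxFor calls bind to this helper without touching the views; the remaining overloads would need the same treatment.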

    Read the article

  • ASP.NET MVC - dropdown list post handling problem

    - by ile
    I've had troubles for a few days already with handling a form that contains a dropdown list. I tried everything I've learned so far, but nothing helps. This is my code:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using CMS;
        using CMS.Model;
        using System.ComponentModel.DataAnnotations;

        namespace Portal.Models
        {
            public class ArticleDisplay
            {
                public ArticleDisplay() { }

                public int CategoryID { set; get; }
                public string CategoryTitle { set; get; }
                public int ArticleID { set; get; }
                public string ArticleTitle { set; get; }
                public DateTime ArticleDate;
                public string ArticleContent { set; get; }
            }

            public class HomePageViewModel
            {
                public HomePageViewModel(IEnumerable<ArticleDisplay> summaries, Article article)
                {
                    this.ArticleSummaries = summaries;
                    this.NewArticle = article;
                }

                public IEnumerable<ArticleDisplay> ArticleSummaries { get; private set; }
                public Article NewArticle { get; private set; }
            }

            public class ArticleRepository
            {
                private DB db = new DB();

                //
                // Query Methods

                public IQueryable<ArticleDisplay> FindAllArticles()
                {
                    var result = from category in db.ArticleCategories
                                 join article in db.Articles on category.CategoryID equals article.CategoryID
                                 orderby article.Date descending
                                 select new ArticleDisplay
                                 {
                                     CategoryID = category.CategoryID,
                                     CategoryTitle = category.Title,
                                     ArticleID = article.ArticleID,
                                     ArticleTitle = article.Title,
                                     ArticleDate = article.Date,
                                     ArticleContent = article.Content
                                 };
                    return result;
                }

                public IQueryable<ArticleDisplay> FindTodayArticles()
                {
                    var result = from category in db.ArticleCategories
                                 join article in db.Articles on category.CategoryID equals article.CategoryID
                                 where article.Date == DateTime.Today
                                 select new ArticleDisplay
                                 {
                                     CategoryID = category.CategoryID,
                                     CategoryTitle = category.Title,
                                     ArticleID = article.ArticleID,
                                     ArticleTitle = article.Title,
                                     ArticleDate = article.Date,
                                     ArticleContent = article.Content
                                 };
                    return result;
                }

                public Article GetArticle(int id)
                {
                    return db.Articles.SingleOrDefault(d => d.ArticleID == id);
                }

                public IQueryable<ArticleDisplay> DetailsArticle(int id)
                {
                    var result = from category in db.ArticleCategories
                                 join article in db.Articles on category.CategoryID equals article.CategoryID
                                 where id == article.ArticleID
                                 select new ArticleDisplay
                                 {
                                     CategoryID = category.CategoryID,
                                     CategoryTitle = category.Title,
                                     ArticleID = article.ArticleID,
                                     ArticleTitle = article.Title,
                                     ArticleDate = article.Date,
                                     ArticleContent = article.Content
                                 };
                    return result;
                }

                //
                // Insert/Delete Methods

                public void Add(Article article)
                {
                    db.Articles.InsertOnSubmit(article);
                }

                public void Delete(Article article)
                {
                    db.Articles.DeleteOnSubmit(article);
                }

                //
                // Persistence

                public void Save()
                {
                    db.SubmitChanges();
                }
            }
        }

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Web.Mvc;
        using Portal.Models;
        using CMS.Model;

        namespace Portal.Areas.CMS.Controllers
        {
            public class ArticleController : Controller
            {
                ArticleRepository articleRepository = new ArticleRepository();
                ArticleCategoryRepository articleCategoryRepository = new ArticleCategoryRepository();

                //
                // GET: /Article/

                public ActionResult Index()
                {
                    ViewData["categories"] = new SelectList
                    (
                        articleCategoryRepository.FindAllCategories().ToList(), "CategoryId", "Title"
                    );
                    Article article = new Article()
                    {
                        Date = DateTime.Now,
                        CategoryID = 1
                    };
                    HomePageViewModel homeData = new HomePageViewModel(articleRepository.FindAllArticles().ToList(), article);
                    return View(homeData);
                }

                //
                // GET: /Article/Details/5

                public ActionResult Details(int id)
                {
                    var article = articleRepository.DetailsArticle(id).Single();
                    if (article == null)
                        return View("NotFound");
                    return View(article);
                }

                //
                // GET: /Article/Create
                //public ActionResult Create()
                //{
                //    ViewData["categories"] = new SelectList
                //    (
                //        articleCategoryRepository.FindAllCategories().ToList(), "CategoryId", "Title"
                //    );
                //    Article article = new Article()
                //    {
                //        Date = DateTime.Now,
                //        CategoryID = 1
                //    };
                //    return View(article);
                //}

                //
                // POST: /Article/Create

                [ValidateInput(false)]
                [AcceptVerbs(HttpVerbs.Post)]
                public ActionResult Create(Article article)
                {
                    if (ModelState.IsValid)
                    {
                        try
                        {
                            // TODO: Add insert logic here
                            articleRepository.Add(article);
                            articleRepository.Save();
                            return RedirectToAction("Index");
                        }
                        catch
                        {
                            return View(article);
                        }
                    }
                    else
                    {
                        return View(article);
                    }
                }

                //
                // GET: /Article/Edit/5

                public ActionResult Edit(int id)
                {
                    ViewData["categories"] = new SelectList
                    (
                        articleCategoryRepository.FindAllCategories().ToList(), "CategoryId", "Title"
                    );
                    var article = articleRepository.GetArticle(id);
                    return View(article);
                }

                //
                // POST: /Article/Edit/5

                [ValidateInput(false)]
                [AcceptVerbs(HttpVerbs.Post)]
                public ActionResult Edit(int id, FormCollection collection)
                {
                    Article article = articleRepository.GetArticle(id);
                    try
                    {
                        // TODO: Add update logic here
                        UpdateModel(article, collection.ToValueProvider());
                        articleRepository.Save();
                        return RedirectToAction("Details", new { id = article.ArticleID });
                    }
                    catch
                    {
                        return View(article);
                    }
                }

                //
                // HTTP GET: /Article/Delete/1

                public ActionResult Delete(int id)
                {
                    Article article = articleRepository.GetArticle(id);
                    if (article == null)
                        return View("NotFound");
                    else
                        return View(article);
                }

                //
                // HTTP POST: /Article/Delete/1

                [AcceptVerbs(HttpVerbs.Post)]
                public ActionResult Delete(int id, string confirmButton)
                {
                    Article article = articleRepository.GetArticle(id);
                    if (article == null)
                        return View("NotFound");
                    articleRepository.Delete(article);
                    articleRepository.Save();
                    return View("Deleted");
                }

                [ValidateInput(false)]
                public ActionResult UpdateSettings(int id, string value, string field)
                {
                    // This highly-specific example is from the original coder's blog system,
                    // but you can substitute your own code here. I assume you can pick out
                    // which text field it is from the id.
                    Article article = articleRepository.GetArticle(id);
                    if (article == null)
                        return Content("Error");
                    if (field == "Title")
                    {
                        article.Title = value;
                        UpdateModel(article, new[] { "Title" });
                        articleRepository.Save();
                    }
                    if (field == "Content")
                    {
                        article.Content = value;
                        UpdateModel(article, new[] { "Content" });
                        articleRepository.Save();
                    }
                    if (field == "Date")
                    {
                        article.Date = Convert.ToDateTime(value);
                        UpdateModel(article, new[] { "Date" });
                        articleRepository.Save();
                    }
                    return Content(value);
                }
            }
        }

    And the view:

        <%@ Page Title="" Language="C#" MasterPageFile="~/Areas/CMS/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<Portal.Models.HomePageViewModel>" %>

        <asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
            Index
        </asp:Content>

        <asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
            <div class="naslov_poglavlja_main">Articles Administration</div>
            <%= Html.ValidationSummary("Create was unsuccessful. Please correct the errors and try again.") %>
            <% using (Html.BeginForm("Create","Article")) {%>
                <div class="news_forma">
                    <label for="Title" class="news">Title:</label>
                    <%= Html.TextBox("Title", "", new { @class = "news" })%>
                    <%= Html.ValidationMessage("Title", "*") %>
                    <label for="Content" class="news">Content:</label>
                    <div class="textarea_okvir">
                        <%= Html.TextArea("Content", "", new { @class = "news" })%>
                        <%= Html.ValidationMessage("Content", "*")%>
                    </div>
                    <label for="CategoryID" class="news">Category:</label>
                    <%= Html.DropDownList("CategoryId", (IEnumerable<SelectListItem>)ViewData["categories"], new { @class = "news" })%>
                    <p>
                        <input type="submit" value="Publish" class="form_submit" />
                    </p>
                </div>
            <% } %>
            <div class="naslov_poglavlja_main"><%= Html.ActionLink("Write new article...", "Create") %></div>
            <div id="articles">
                <% foreach (var item in Model.ArticleSummaries) { %>
                    <div>
                        <div class="naslov_vijesti" id="<%= item.ArticleID %>"><%= Html.Encode(item.ArticleTitle) %></div>
                        <div class="okvir_vijesti">
                            <div class="sadrzaj_vijesti" id="<%= item.ArticleID %>"><%= item.ArticleContent %></div>
                            <div class="datum_vijesti" id="<%= item.ArticleID %>"><%= Html.Encode(String.Format("{0:g}", item.ArticleDate)) %></div>
                            <a class="news_delete" href="#" id="<%= item.ArticleID %>">Delete</a>
                        </div>
                        <div class="dno"></div>
                    </div>
                <% } %>
            </div>
        </asp:Content>

    When trying to post a new article I get the following error:

        System.InvalidOperationException: The ViewData item that has the key 'CategoryId' is of type 'System.Int32' but must be of type 'IEnumerable<SelectListItem>'.

    I really don't know what to do, because I'm pretty new to .NET and MVC. Any help appreciated! Ile

    EDIT: I found where I made a mistake: I didn't include the date. If I add this line to the form in the view, I'm able to add an article:

        <%= Html.Hidden("Date", String.Format("{0:g}", Model.NewArticle.Date)) %>

    But if I enter a wrong date type, or leave title and content empty, then I get the same error. In this example there is no need for date editing, but I will need it for some other forms, and validation will be necessary.

    EDIT 2: The error happens when posting! Call stack:

        App_Web_of9beco9.dll!ASP.areas_cms_views_article_create_aspx.__RenderContent2(System.Web.UI.HtmlTextWriter __w = {System.Web.UI.HtmlTextWriter}, System.Web.UI.Control parameterContainer = {System.Web.UI.WebControls.ContentPlaceHolder}) Line 31 + 0x9f bytes C#
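
    The call stack shows the exception is thrown while re-rendering the Create view after a failed post. Html.DropDownList("CategoryId", ...) falls back to looking up ViewData under its own name when the supplied select list is null, and on the post-back "categories" was only populated by the GET action, so the helper finds the posted category id (an int) instead. A hedged sketch of the fix, assuming the Create view renders the same form: repopulate the list on every path that re-renders, and consider renaming the ViewData key so it can never collide with the posted field name.

        [ValidateInput(false)]
        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult Create(Article article)
        {
            if (ModelState.IsValid)
            {
                try
                {
                    articleRepository.Add(article);
                    articleRepository.Save();
                    return RedirectToAction("Index");
                }
                catch { /* fall through and re-render the form */ }
            }
            // rebuild the select list before re-rendering, otherwise DropDownList("CategoryId")
            // finds the posted int in ViewData and throws the InvalidOperationException above
            ViewData["categories"] = new SelectList(
                articleCategoryRepository.FindAllCategories().ToList(), "CategoryId", "Title");
            return View(article);
        }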

    Read the article

  • Problems with jQuery getJSON using local files in Chrome

    - by Tauren
    I have a very simple test page that uses XHR requests with jQuery's $.getJSON and $.ajax methods. The same page works in some situations and not in others. Specifically, it doesn't work in Chrome on Ubuntu. I'm testing on Ubuntu 9.10 with Chrome 5.0.342.7 beta and Mac OS X 10.6.2 with Chrome 5.0.307.9 beta.

    - It works correctly when the files are installed on a web server, from both Ubuntu/Chrome and Mac/Chrome (try it out here).
    - It works correctly when the files are installed on the local hard drive in Mac/Chrome (accessed with file:///...).
    - It FAILS when the files are installed on the local hard drive in Ubuntu/Chrome (accessed with file:///...).

    The small set of 3 files can be downloaded in a tar/gzip file from here: http://issues.tauren.com/testjson/testjson.tgz

    When it works, the Chrome console will say:

        XHR finished loading: "http://issues.tauren.com/testjson/data.json".
        index.html:16 Using getJSON
        index.html:21 Object
            result: "success"
            __proto__: Object
        index.html:22 success
        XHR finished loading: "http://issues.tauren.com/testjson/data.json".
        index.html:29 Using ajax with json dataType
        index.html:34 Object
            result: "success"
            __proto__: Object
        index.html:35 success
        XHR finished loading: "http://issues.tauren.com/testjson/data.json".
        index.html:46 Using ajax with text dataType
        index.html:51 {"result":"success"}
        index.html:52 undefined

    When it doesn't work, the Chrome console will show this:

        index.html:16 Using getJSON
        index.html:21 null
        index.html:22 Uncaught TypeError: Cannot read property 'result' of null
        index.html:29 Using ajax with json dataType
        index.html:34 null
        index.html:35 Uncaught TypeError: Cannot read property 'result' of null
        index.html:46 Using ajax with text dataType
        index.html:51
        index.html:52 undefined

    Notice that it doesn't even show the XHR requests, although the success handler is run. I swear this was working previously in Ubuntu/Chrome, and am worried something got messed up. I already uninstalled and reinstalled Chrome, but that didn't help. Can someone try it out locally on your Ubuntu system and tell me if you have any troubles? Note that it seems to be working fine in Firefox.
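
    One hedged explanation: Chrome restricts XHR against file:// URLs (each local file is treated as its own origin), and how strictly that is enforced has varied between builds and platforms, which would account for the null responses on one machine and not another. A commonly cited workaround is to launch Chrome with local file access enabled (all Chrome instances must be closed first for the flag to take effect):

        google-chrome --allow-file-access-from-files

    Serving the three files from any local web server avoids the issue without flags.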

    Read the article

  • Android Canvas Coordinate System

    - by Mitch
    I'm trying to find information on how to change the coordinate system for the canvas. I have some vector data I'd like to draw to a canvas using things like circles and lines, but the data's coordinate system doesn't match the canvas coordinate system. Is there a way to map the units I'm using to the screen's units? I'm drawing to an ImageView which isn't taking up the entire display. If I have to do my own calculations prior to each drawing call, how do I find the width and height of my ImageView? The getWidth() and getHeight() calls I tried seem to be returning the entire canvas size and not the size of the ImageView, which isn't helpful. I see some matrix stuff; is that something that will work for me? I tried to use the "public void scale(float sx, float sy)" method, but that works more like a pixel-level zoom rather than a vector scale function, since it expands each pixel. This means that if the dimensions are increased to fit the screen, the line thickness is also increased.

    Update: After some research I'm starting to think there's no way to change the coordinate system to something else. I'll need to map all my coordinates to the screen's pixel coordinates, and do so by modifying each vector. The getWidth() and getHeight() calls seem to be working better for me now. I can't say what was wrong, but I suspect I can't use these methods inside the constructor.

    Read the article

< Previous Page | 342 343 344 345 346 347 348 349 350 351 352 353  | Next Page >