Search Results

Search found 55276 results on 2212 pages for 'eicar test string'.


  • System.NullReferenceException with WhoCanHelpMe Unit Test

    - by tlum
    I'm working with a test project based on WhoCanHelpMe, which is based on Sharp Architecture, NHibernateValidator, etc. As it's written, the when_the_profile_tasks_is_asked_to_create_a_profile unit test creates the profile object and saves it without issue. Now, the profile object is a CreateProfileDetails type that derives from their own ValidatableValueObject, which implements the IValidatable interface. The problem surfaces when my class is based on an Entity rather than their ValidatableValueObject: when the test is run, a System.NullReferenceException is thrown because Validate() is null. I'm afraid I'm at a loss to resolve this behavior. Does anyone have suggestions to get to the bottom of this? Thanks, -Ted-

    Read the article

  • Best practices to test protected methods with PHPUnit

    - by GrGr
    Hello, I found the discussion on "Do you test private methods?" informative. I have decided that, in some classes, I want to have protected methods but still test them. Some of these methods are static and short. Because most of the public methods make use of them, I will probably be able to safely remove the tests later. But to start with a TDD approach and avoid debugging, I really want to test them. I thought of the following: the Method Object pattern, as advised in one answer, seems to be overkill for this. Start with public methods and, once code coverage is provided by higher-level tests, make them protected and remove the tests. Inherit a class with a testable interface that makes the protected methods public. Which is best practice? Is there anything else? It seems that JUnit automatically changes protected methods to public, but I have not looked at it in depth. PHP does not allow this via reflection.

    Read the article

  • Works for Short Input, Fails for Long Input. How to Solve?

    - by r0ach
    I have this program, which finds a substring in a string. It works for small inputs but fails for longer ones. Here's the program: //Find Substring in given String #include <stdio.h> #include <string.h> main() { //Variable Initialization int i=0,j=0,k=0; char sentence[50],temp[50],search[50]; //Gets Strings printf("Enter Sentence: "); fgets(sentence,50,stdin); printf("Enter Search: "); fgets(search,50,stdin); //Actual Work Loop while(sentence[i]!='\0') { k=i;j=0; while(sentence[k]==search[j]) { temp[j]=sentence[k]; j++; k++; } if(strcmp(temp,search)==0) break; i++; } //Output Printing printf("Found string at: %d \n",k-strlen(search)); } Works for: Enter Sentence: good evening Enter Search: evening Found string at 6 Fails for: Enter Sentence: dear god please make this work Enter Search: make Found string at 25 Which is totally wrong. Can any expert find me a solution? P.S.: This is kinda like reinventing the wheel since strstr() has this functionality, but I'm trying for a non-library way of doing it.
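
    The likely culprits in the C above are that temp is never null-terminated or cleared between outer-loop iterations, so strcmp ends up comparing stale bytes, and that fgets leaves a trailing newline in search, which only matches when the search word happens to end the sentence. A minimal Python sketch of the same manual scan (returning the match index, or -1 when absent) shows the intended logic:

      def find_substring(sentence, search):
          """Return the index of the first occurrence of search in sentence, or -1."""
          n, m = len(sentence), len(search)
          for i in range(n - m + 1):
              j = 0
              while j < m and sentence[i + j] == search[j]:
                  j += 1
              if j == m:          # every character of search matched
                  return i
          return -1

      print(find_substring("good evening", "evening"))                  # 5
      print(find_substring("dear god please make this work", "make"))   # 16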

    Read the article

  • How to test that action uses argument?

    - by Caster Troy
    I am supposed to be using test-driven development, but in this particular case, as I am having trouble, I implemented the action method first. It looks like this: public ViewResult Index(int pageNumber = 1) { var posts = repository.All(); var model = new PagedList<Post>(posts, pageNumber, PageSize); return View(model); } Both the repository and the PagedList<> have been tested already. Now I want to verify that when the action is given a page number, that page number is actually used. private Mock<IPostsRepository> repository; private HomeController controller; [Test] public void Index_Doohickey() { var actual = controller.Index(2); // .. How do I test that the controller actually uses the page number here? }
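
    Two common ways to pin this down are to assert on the page number exposed by the returned model, or to inject whatever builds the paged list and assert on the arguments it receives. A minimal Python/unittest.mock sketch of the second option; the controller, repository and paged-list factory here are hypothetical stand-ins for the C# code above, not the original API:

      import unittest
      from unittest import mock

      PAGE_SIZE = 10

      class HomeController:
          def __init__(self, repository, paged_list_factory):
              self.repository = repository
              self.paged_list_factory = paged_list_factory

          def index(self, page_number=1):
              posts = self.repository.all()
              return self.paged_list_factory(posts, page_number, PAGE_SIZE)

      class IndexTest(unittest.TestCase):
          def test_index_passes_page_number_to_paged_list(self):
              repository = mock.Mock()
              repository.all.return_value = ["post"]
              paged_list = mock.Mock()

              controller = HomeController(repository, paged_list)
              controller.index(page_number=2)

              # The assertion that the argument was actually used:
              paged_list.assert_called_once_with(["post"], 2, PAGE_SIZE)

      if __name__ == "__main__":
          unittest.main()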

    Read the article

  • Log your SQL in Rails application inside unit test

    - by Phuong Nguyễn
    I want to install a logger so that I can dump all SQL executed by my Rails application. The problem is that such a logger is associated with AbstractAdapter, which is initialized very early in test mode and thus cannot be set by my initializer code. I tried putting ActiveRecord::Base.logger = MyCustomLogger.new(STDOUT) at the end of environment.rb, as someone advised, but it only works when run in the console environment (started by script/console), not when run in test mode. I wonder if there is any way to configure such a logger so that it is sure to be invoked in any environment (test, development, production, console).

    Read the article

  • unittest in python: ignore an import from the code I want to test

    - by vaidab
    I have a python program that imports pythoncom (and uses pythoncom.CoCreateInstance from it). I want to create a unittest for the program logic without it importing pythoncom (so I can run the test on Linux as well). What options are there? Can I do it without modifying the system under test? What I found so far: sys.modules["pythoncom"] = "test" import module_that_imports_pythoncom My problem with it is if I have: from pythoncom.something import something I'll get: ImportError: No module named something.something And sys.modules["something.something"] or sys.modules["pythoncom.something.something"] doesn't work. Any ideas?
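
    The string assigned in the snippet above can satisfy import pythoncom, but from pythoncom.something import something also needs a module object registered under the submodule's dotted name. A sketch of one way to do that with types.ModuleType and unittest.mock, using the module names from the question; the fakes must be installed before the system under test is imported:

      import sys
      import types
      import unittest
      from unittest import mock

      class PythoncomFreeTest(unittest.TestCase):
          def test_logic_without_real_pythoncom(self):
              # Register real module objects (not strings) so that both
              # "import pythoncom" and "from pythoncom.something import something"
              # resolve against the fakes.
              fake_pythoncom = types.ModuleType("pythoncom")
              fake_sub = types.ModuleType("pythoncom.something")
              fake_sub.something = mock.MagicMock(name="something")
              fake_pythoncom.something = fake_sub
              fake_pythoncom.CoCreateInstance = mock.MagicMock(name="CoCreateInstance")

              sys.modules["pythoncom"] = fake_pythoncom
              sys.modules["pythoncom.something"] = fake_sub
              try:
                  import module_that_imports_pythoncom  # the module from the question
              finally:
                  # clean up so other tests see the real import state
                  sys.modules.pop("pythoncom", None)
                  sys.modules.pop("pythoncom.something", None)

      if __name__ == "__main__":
          unittest.main()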

    Read the article

  • Scheduling a visual studio load test using powershell giving me BSOD

    - by user952342
    I have a Visual Studio load test which I want to run every hour so that I can start to collect some data. To do this, I thought it would be best to make a little PowerShell script and put a command like this inside: Invoke-Expression -command "& '$env:VS100COMNTOOLS..\IDE\mstest.exe' /testcontainer:"C:\Users\benb\Documents\Visual Studio 2010\Projects\BBPerformanceTest\bin\Debug\HomePageOnly.loadtest"" That command works fine, but sometimes when it's run I get a blue screen of death. However, when I run my load test through the Visual Studio GUI, I never get a BSOD. Two questions: is it possible to avoid this BSOD? Is there another way I can schedule my load test? Thanks

    Read the article

  • Throwing special type of exception to terminate unit test

    - by trendl
    Assume I want to write a unit test for a particular piece of functionality that is implemented within a method. If I wanted to execute the method completely, I would have to do some extra setup work (mock object expectations, etc.). Instead of doing that, I use the following approach: I set up the expectations I'm interested in verifying and then make the tested method throw a special type of exception (e.g. TerminateTestException); further down in the unit test I catch the exception and verify the mock object expectations. It works fine, but I'm not sure it is good practice. I do not do this regularly, only in cases where it saves me time and effort. One argument against it that comes to mind is that throwing exceptions takes a long time, so the tests execute more slowly than with a different approach.
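
    For what it's worth, the same pattern translates directly to other mocking stacks. A minimal Python/unittest.mock sketch (Service, gateway and mailer are made-up names, not from the question) in which the last interesting collaborator raises the terminating exception and the test then verifies only the calls it cares about:

      import unittest
      from unittest import mock

      class TerminateTestException(Exception):
          """Raised by a stubbed collaborator to cut the tested method short."""

      class Service:
          # Hypothetical system under test: only the first collaborator call
          # is interesting here; everything after it would need extra setup.
          def __init__(self, gateway, mailer):
              self.gateway = gateway
              self.mailer = mailer

          def process(self, order):
              self.gateway.charge(order)
              self.mailer.send_receipt(order)   # not under test here

      class ProcessTest(unittest.TestCase):
          def test_charges_gateway(self):
              gateway = mock.Mock()
              mailer = mock.Mock()
              # Make the last "interesting" call terminate the method early.
              gateway.charge.side_effect = TerminateTestException
              service = Service(gateway, mailer)

              with self.assertRaises(TerminateTestException):
                  service.process("order-1")

              gateway.charge.assert_called_once_with("order-1")
              mailer.send_receipt.assert_not_called()

      if __name__ == "__main__":
          unittest.main()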

    Read the article

  • Java Cloud Service Integration to REST Service

    - by Jani Rautiainen
    Service (JCS) provides a platform to develop and deploy business applications in the cloud. In Fusion Applications Cloud deployments customers do not have the option to deploy custom applications developed with JDeveloper to ensure the integrity and supportability of the hosted application service. Instead the custom applications can be deployed to the JCS and integrated to the Fusion Application Cloud instance. This series of articles will go through the features of JCS, provide end-to-end examples on how to develop and deploy applications on JCS and how to integrate them with the Fusion Applications instance. In this article a custom application integrating with REST service will be implemented. We will use REST services provided by Taleo as an example; however the same approach will work with any REST service. In this example the data from the REST service is used to populate a dynamic table. Pre-requisites Access to Cloud instance In order to deploy the application access to a JCS instance is needed, a free trial JCS instance can be obtained from Oracle Cloud site. To register you will need a credit card even if the credit card will not be charged. To register simply click "Try it" and choose the "Java" option. The confirmation email will contain the connection details. See this video for example of the registration.Once the request is processed you will be assigned 2 service instances; Java and Database. Applications deployed to the JCS must use Oracle Database Cloud Service as their underlying database. So when JCS instance is created a database instance is associated with it using a JDBC data source.The cloud services can be monitored and managed through the web UI. For details refer to Getting Started with Oracle Cloud. JDeveloper JDeveloper contains Cloud specific features related to e.g. connection and deployment. To use these features download the JDeveloper from JDeveloper download site by clicking the "Download JDeveloper 11.1.1.7.1 for ADF deployment on Oracle Cloud" link, this version of JDeveloper will have the JCS integration features that will be used in this article. For versions that do not include the Cloud integration features the Oracle Java Cloud Service SDK or the JCS Java Console can be used for deployment. For details on installing and configuring the JDeveloper refer to the installation guideFor details on SDK refer to Using the Command-Line Interface to Monitor Oracle Java Cloud Service and Using the Command-Line Interface to Manage Oracle Java Cloud Service. Access to a local database The database associated with the JCS instance cannot be connected to with JDBC.  Since creating ADFbc business component requires a JDBC connection we will need access to a local database. 3rd party libraries This example will use some 3rd party libraries for implementing the REST service call and processing the input / output content. Other libraries may also be used, however these are tested to work. Jersey 1.x Jersey library will be used as a client to make the call to the REST service. JCS documentation for supported specifications states: Java API for RESTful Web Services (JAX-RS) 1.1 So Jersey 1.x will be used. Download the single-JAR Jersey bundle; in this example Jersey 1.18 JAR bundle is used. Json-simple Jjson-simple library will be used to process the json objects. Download the  JAR file; in this example json-simple-1.1.1.jar is used. Accessing data in Taleo Before implementing the application it is beneficial to familiarize oneself with the data in Taleo. 
Easiest way to do this is by using a RESTClient on your browser. Once added to the browser you can access the UI: The client can be used to call the REST services to test the URLs and data before adding them into the application. First derive the base URL for the service this can be done with: Method: GET URL: https://tbe.taleo.net/MANAGER/dispatcher/api/v1/serviceUrl/<company name> The response will contain the base URL to be used for the service calls for the company. Next obtain authentication token with: Method: POST URL: https://ch.tbe.taleo.net/CH07/ats/api/v1/login?orgCode=<company>&userName=<user name>&password=<password> The response includes an authentication token that can be used for few hours to authenticate with the service: {   "response": {     "authToken": "webapi26419680747505890557"   },   "status": {     "detail": {},     "success": true   } } To authenticate the service calls navigate to "Headers -> Custom Header": And add a new request header with: Name: Cookie Value: authToken=webapi26419680747505890557 Once authentication token is defined the tool can be used to invoke REST services; for example: Method: GET URL: https://ch.tbe.taleo.net/CH07/ats/api/v1/object/candidate/search.xml?status=16 This data will be used on the application to be created. For details on the Taleo REST services refer to the Taleo Business Edition REST API Guide. Create Application First Fusion Web Application is created and configured. Start JDeveloper and click "New Application": Application Name: JcsRestDemo Application Package Prefix: oracle.apps.jcs.test Application Template: Fusion Web Application (ADF) Configure Local Cloud Connection Follow the steps documented in the "Java Cloud Service ADF Web Application" article to configure a local database connection needed to create the ADFbc objects. Configure Libraries Add the 3rd party libraries into the class path. Create the following directory and copy the jar files into it: <JDEV_USER_HOME>/JcsRestDemo/lib  Select the "Model" project, navigate "Application -> Project Properties -> Libraries and Classpath -> Add JAR / Directory" and add the 2 3rd party libraries: Accessing Data from Taleo To access data from Taleo using the REST service the 3rd party libraries will be used. 2 Java classes are implemented, one representing the Candidate object and another for accessing the Taleo repository Candidate Candidate object is a POJO object used to represent the candidate data obtained from the Taleo repository. The data obtained will be used to populate the ADFbc object used to display the data on the UI. The candidate object contains simply the variables we obtain using the REST services and the getters / setters for them: Navigate "New -> General -> Java -> Java Class", enter "Candidate" as the name and create it in the package "oracle.apps.jcs.test.model".  
Copy / paste the following as the content: import oracle.jbo.domain.Number; public class Candidate { private Number candId; private String firstName; private String lastName; public Candidate() { super(); } public Candidate(Number candId, String firstName, String lastName) { super(); this.candId = candId; this.firstName = firstName; this.lastName = lastName; } public void setCandId(Number candId) { this.candId = candId; } public Number getCandId() { return candId; } public void setFirstName(String firstName) { this.firstName = firstName; } public String getFirstName() { return firstName; } public void setLastName(String lastName) { this.lastName = lastName; } public String getLastName() { return lastName; } } Taleo Repository Taleo repository class will interact with the Taleo REST services. The logic will query data from Taleo and populate Candidate objects with the data. The Candidate object will then be used to populate the ADFbc object used to display data on the UI. Navigate "New -> General -> Java -> Java Class", enter "TaleoRepository" as the name and create it in the package "oracle.apps.jcs.test.model".  Copy / paste the following as the content (for details of the implementation refer to the documentation in the code): import com.sun.jersey.api.client.Client; import com.sun.jersey.api.client.ClientResponse; import com.sun.jersey.api.client.WebResource; import com.sun.jersey.core.util.MultivaluedMapImpl; import java.io.StringReader; import java.util.ArrayList; import java.util.Iterator; import java.util.List; import java.util.Map; import javax.ws.rs.core.MediaType; import javax.ws.rs.core.MultivaluedMap; import oracle.jbo.domain.Number; import org.json.simple.JSONArray; import org.json.simple.JSONObject; import org.json.simple.parser.JSONParser; /** * This class interacts with the Taleo REST services */ public class TaleoRepository { /** * Connection information needed to access the Taleo services */ String _company = null; String _userName = null; String _password = null; /** * Jersey client used to access the REST services */ Client _client = null; /** * Parser for processing the JSON objects used as * input / output for the services */ JSONParser _parser = null; /** * The base url for constructing the REST URLs. This is obtained * from Taleo with a service call */ String _baseUrl = null; /** * Authentication token obtained from Taleo using a service call. * The token can be used to authenticate on subsequent * service calls. The token will expire in 4 hours */ String _authToken = null; /** * Static url that can be used to obtain the url used to construct * service calls for a given company */ private static String _taleoUrl = "https://tbe.taleo.net/MANAGER/dispatcher/api/v1/serviceUrl/"; /** * Default constructor for the repository * Authentication details are passed as parameters and used to generate * authentication token. Note that each service call will * generate its own token. This is done to avoid dealing with the expiry * of the token. Also only 20 tokens are allowed per user simultaneously. * So instead for each call there is login / logout. * * @param company the company for which the service calls are made * @param userName the user name to authenticate with * @param password the password to authenticate with. 
*/ public TaleoRepository(String company, String userName, String password) { super(); _company = company; _userName = userName; _password = password; _client = Client.create(); _parser = new JSONParser(); _baseUrl = getBaseUrl(); } /** * This obtains the base url for a company to be used * to construct the urls for service calls * @return base url for the service calls */ private String getBaseUrl() { String result = null; if (null != _baseUrl) { result = _baseUrl; } else { try { String company = _company; WebResource resource = _client.resource(_taleoUrl + company); ClientResponse response = resource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).get(ClientResponse.class); String entity = response.getEntity(String.class); JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity)); JSONObject jsonResponse = (JSONObject)jsonObject.get("response"); result = (String)jsonResponse.get("URL"); } catch (Exception ex) { ex.printStackTrace(); } } return result; } /** * Generates authentication token, that can be used to authenticate on * subsequent service calls. Note that each service call will * generate its own token. This is done to avoid dealing with the expiry * of the token. Also only 20 tokens are allowed per user simultaneously. * So instead for each call there is login / logout. * @return authentication token that can be used to authenticate on * subsequent service calls */ private String login() { String result = null; try { MultivaluedMap<String, String> formData = new MultivaluedMapImpl(); formData.add("orgCode", _company); formData.add("userName", _userName); formData.add("password", _password); WebResource resource = _client.resource(_baseUrl + "login"); ClientResponse response = resource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).post(ClientResponse.class, formData); String entity = response.getEntity(String.class); JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity)); JSONObject jsonResponse = (JSONObject)jsonObject.get("response"); result = (String)jsonResponse.get("authToken"); } catch (Exception ex) { throw new RuntimeException("Unable to login ", ex); } if (null == result) throw new RuntimeException("Unable to login "); return result; } /** * Releases a authentication token. Each call to login must be followed * by call to logout after the processing is done. This is required as * the tokens are limited to 20 per user and if not released the tokens * will only expire after 4 hours. * @param authToken */ private void logout(String authToken) { WebResource resource = _client.resource(_baseUrl + "logout"); resource.header("cookie", "authToken=" + authToken).post(ClientResponse.class); } /** * This method is used to obtain a list of candidates using a REST * service call. At this example the query is hard coded to query * based on status. 
The url constructed to access the service is: * <_baseUrl>/object/candidate/search.xml?status=16 * @return List of candidates obtained with the service call */ public List<Candidate> getCandidates() { List<Candidate> result = new ArrayList<Candidate>(); try { // First login, note that in finally block we must have logout _authToken = "authToken=" + login(); /** * Construct the URL, the resulting url will be: * <_baseUrl>/object/candidate/search.xml?status=16 */ MultivaluedMap<String, String> formData = new MultivaluedMapImpl(); formData.add("status", "16"); JSONArray searchResults = (JSONArray)getTaleoResource("object/candidate/search", "searchResults", formData); /** * Process the results, the resulting JSON object is something like * this (simplified for readability): * * { * "response": * { * "searchResults": * [ * { * "candidate": * { * "candId": 211, * "firstName": "Mary", * "lastName": "Stochi", * logic here will find the candidate object(s), obtain the desired * data from them, construct a Candidate object based on the data * and add it to the results. */ for (Object object : searchResults) { JSONObject temp = (JSONObject)object; JSONObject candidate = (JSONObject)findObject(temp, "candidate"); Long candIdTemp = (Long)candidate.get("candId"); Number candId = (null == candIdTemp ? null : new Number(candIdTemp)); String firstName = (String)candidate.get("firstName"); String lastName = (String)candidate.get("lastName"); result.add(new Candidate(candId, firstName, lastName)); } } catch (Exception ex) { ex.printStackTrace(); } finally { if (null != _authToken) logout(_authToken); } return result; } /** * Convenience method to construct url for the service call, invoke the * service and obtain a resource from the response * @param path the path for the service to be invoked. This is combined * with the base url to construct a url for the service * @param resource the key for the object in the response that will be * obtained * @param parameters any parameters used for the service call. The call * is slightly different depending whether parameters exist or not. * @return the resource from the response for the service call */ private Object getTaleoResource(String path, String resource, MultivaluedMap<String, String> parameters) { Object result = null; try { WebResource webResource = _client.resource(_baseUrl + path); ClientResponse response = null; if (null == parameters) response = webResource.header("cookie", _authToken).get(ClientResponse.class); else response = webResource.queryParams(parameters).header("cookie", _authToken).get(ClientResponse.class); String entity = response.getEntity(String.class); JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity)); result = findObject(jsonObject, resource); } catch (Exception ex) { ex.printStackTrace(); } return result; } /** * Convenience method to recursively find a object with an key * traversing down from a given root object. This will traverse a * JSONObject / JSONArray recursively to find a matching key, if found * the object with the key is returned. 
* @param root root object which contains the key searched for * @param key the key for the object to search for * @return the object matching the key */ private Object findObject(Object root, String key) { Object result = null; if (root instanceof JSONObject) { JSONObject rootJSON = (JSONObject)root; if (rootJSON.containsKey(key)) { result = rootJSON.get(key); } else { Iterator children = rootJSON.entrySet().iterator(); while (children.hasNext()) { Map.Entry entry = (Map.Entry)children.next(); Object child = entry.getValue(); if (child instanceof JSONObject || child instanceof JSONArray) { result = findObject(child, key); if (null != result) break; } } } } else if (root instanceof JSONArray) { JSONArray rootJSON = (JSONArray)root; for (Object child : rootJSON) { if (child instanceof JSONObject || child instanceof JSONArray) { result = findObject(child, key); if (null != result) break; } } } return result; } }   Creating Business Objects While JCS application can be created without a local database, the local database is required when using ADFbc objects even if database objects are not referred. For this example we will create a "Transient" view object that will be programmatically populated based the data obtained from Taleo REST services. Creating ADFbc objects Choose the "Model" project and navigate "New -> Business Tier : ADF Business Components : View Object". On the "Initialize Business Components Project" choose the local database connection created in previous step. On Step 1 enter "JcsRestDemoVO" on the "Name" and choose "Rows populated programmatically, not based on query": On step 2 create the following attributes: CandId Type: Number Updatable: Always Key Attribute: checked Name Type: String Updatable: Always On steps 3 and 4 accept defaults and click "Next".  On step 5 check the "Application Module" checkbox and enter "JcsRestDemoAM" as the name: Click "Finish" to generate the objects. Populating the VO To display the data on the UI the "transient VO" is populated programmatically based on the data obtained from the Taleo REST services. Open the "JcsRestDemoVOImpl.java". Copy / paste the following as the content (for details of the implementation refer to the documentation in the code): import java.sql.ResultSet; import java.util.List; import java.util.ListIterator; import oracle.jbo.server.ViewObjectImpl; import oracle.jbo.server.ViewRowImpl; import oracle.jbo.server.ViewRowSetImpl; // --------------------------------------------------------------------- // --- File generated by Oracle ADF Business Components Design Time. // --- Tue Feb 18 09:40:25 PST 2014 // --- Custom code may be added to this class. // --- Warning: Do not modify method signatures of generated methods. // --------------------------------------------------------------------- public class JcsRestDemoVOImpl extends ViewObjectImpl { /** * This is the default constructor (do not remove). */ public JcsRestDemoVOImpl() { } @Override public void executeQuery() { /** * For some reason we need to reset everything, otherwise * 2nd entry to the UI screen may fail with * "java.util.NoSuchElementException" in createRowFromResultSet * call to "candidates.next()". I am not sure why this is happening * as the Iterator is new and "hasNext" is true at the point * of the execution. My theory is that since the iterator object is * exactly the same the VO cache somehow reuses the iterator including * the pointer that has already exhausted the iterable elements on the * previous run. 
Working around the issue * here by cleaning out everything on the VO every time before query * is executed on the VO. */ getViewDef().setQuery(null); getViewDef().setSelectClause(null); setQuery(null); this.reset(); this.clearCache(); super.executeQuery(); } /** * executeQueryForCollection - overridden for custom java data source support. */ protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) { /** * Integrate with the Taleo REST services using TaleoRepository class. * A list of candidates matching a hard coded query is obtained. */ TaleoRepository repository = new TaleoRepository(<company>, <username>, <password>); List<Candidate> candidates = repository.getCandidates(); /** * Store iterator for the candidates as user data on the collection. * This will be used in createRowFromResultSet to create rows based on * the custom iterator. */ ListIterator<Candidate> candidatescIterator = candidates.listIterator(); setUserDataForCollection(qc, candidatescIterator); super.executeQueryForCollection(qc, params, noUserParams); } /** * hasNextForCollection - overridden for custom java data source support. */ protected boolean hasNextForCollection(Object qc) { boolean result = false; /** * Determines whether there are candidates for which to create a row */ ListIterator<Candidate> candidates = (ListIterator<Candidate>)getUserDataForCollection(qc); result = candidates.hasNext(); /** * If all candidates to be created indicate that processing is done */ if (!result) { setFetchCompleteForCollection(qc, true); } return result; } /** * createRowFromResultSet - overridden for custom java data source support. */ protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) { /** * Obtain the next candidate from the collection and create a row * for it. */ ListIterator<Candidate> candidates = (ListIterator<Candidate>)getUserDataForCollection(qc); ViewRowImpl row = createNewRowForCollection(qc); try { Candidate candidate = candidates.next(); row.setAttribute("CandId", candidate.getCandId()); row.setAttribute("Name", candidate.getFirstName() + " " + candidate.getLastName()); } catch (Exception e) { e.printStackTrace(); } return row; } /** * getQueryHitCount - overridden for custom java data source support. */ public long getQueryHitCount(ViewRowSetImpl viewRowSet) { /** * For this example this is not implemented rather we always return 0. */ return 0; } } Creating UI Choose the "ViewController" project and navigate "New -> Web Tier : JSF : JSF Page". On the "Create JSF Page" enter "JcsRestDemo" as name and ensure that the "Create as XML document (*.jspx)" is checked.  Open "JcsRestDemo.jspx" and navigate to "Data Controls -> JcsRestDemoAMDataControl -> JcsRestDemoVO1" and drag & drop the VO to the "<af:form> " as a "ADF Read-only Table": Accept the defaults in "Edit Table Columns". To execute the query navigate to to "Data Controls -> JcsRestDemoAMDataControl -> JcsRestDemoVO1 -> Operations -> Execute" and drag & drop the operation to the "<af:form> " as a "Button": Deploying to JCS Follow the same steps as documented in previous article"Java Cloud Service ADF Web Application". Once deployed the application can be accessed with URL: https://java-[identity domain].java.[data center].oraclecloudapps.com/JcsRestDemo-ViewController-context-root/faces/JcsRestDemo.jspx The UI displays a list of candidates obtained from the Taleo REST Services: Summary In this article we learned how to integrate with REST services using Jersey library in JCS. 
In future articles various other integration techniques will be covered.

    Read the article

  • How to TDD test that objects are being added to a collection if the collection is private?

    - by Joshua Harris
    Assume that I planned to write a class that worked something like this: public class GameCharacter { private Collection<CharacterEffect> _collection; public void Add(CharacterEffect e) { ... } public void Remove(CharacterEffect e) { ... } public void Contains(CharacterEffect e) { ... } } When added an effect does something to the character and is then added to the _collection. When it is removed the effect reverts the change to the character and is removed from the _collection. It's easy to test if the effect was applied to the character, but how do I test that the effect was added to _collection? What test could I write to start constructing this class. I could write a test where Contains would return true for a certain effect being in _collection, but I can't arrange a case where that function would return true because I haven't implemented the Add method that is needed to place things in _collection. Ok, so since Contains is dependent on having Add working, then why don't I try to create Add first. Well for my first test I need to try and figure out if the effect was added to the _collection. How would I do that? The only way to see if an effect is in _collection is with the Contains function. The only way that I could think to test this would be to use a FakeCollection that Mocks the Add, Remove, and Contains of a real collection, but I don't want _collection being affected by outside sources. I don't want to add a setEffects(Collection effects) function, because I do not want the class to have that functionality. The one thing that I am thinking could work is this: public class GameCharacter<C extends Collection> { private Collection<CharacterEffect> _collection; public GameCharacter() { _collection = new C<CharacterEffect>(); } } But, that is just silly making me declare what some private data structures type is on every declaration of the character. Is there a way for me to test this without breaking TDD principles while still allowing me to keep my collection private?
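
    One common way out of the chicken-and-egg problem described here is to treat Add and Contains as a single observable behaviour and let one red test drive both, leaving the collection private. A minimal Python sketch of that state-based style; the class names mirror the snippet above, but the bodies are hypothetical:

      import unittest

      class CharacterEffect:
          pass

      class GameCharacter:
          def __init__(self):
              self._effects = set()          # private collection, analogous to _collection

          def add(self, effect):
              self._effects.add(effect)      # plus whatever the effect does to the character

          def remove(self, effect):
              self._effects.discard(effect)  # plus reverting the effect

          def contains(self, effect):
              return effect in self._effects

      class GameCharacterTest(unittest.TestCase):
          def test_added_effect_is_reported_by_contains(self):
              character, effect = GameCharacter(), CharacterEffect()
              character.add(effect)
              self.assertTrue(character.contains(effect))

          def test_removed_effect_is_not_reported_by_contains(self):
              character, effect = GameCharacter(), CharacterEffect()
              character.add(effect)
              character.remove(effect)
              self.assertFalse(character.contains(effect))

      if __name__ == "__main__":
          unittest.main()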

    Read the article

  • Nginx + php-fpm "504 Gateway Time-out" error with almost zero load (on a test-server)

    - by rahul286
    After debugging for 6-hours - I am giving this up :| We have a nginx+php-fpm+mysql in LAN with almost 100 wordpress (created and used by different designers/developers all working on test wordpres setup) We are using nginx without any issues from long. Today, all of a sudden - nginx started returning "504 Gateway Time-out" out of the blue... I checked nginx error log for a virtual host... 2010/09/06 21:24:24 [error] 12909#0: *349 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /favicon.ico HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 21:25:11 [error] 12909#0: *349 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /favicon.ico HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 21:25:11 [error] 12909#0: *443 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /info.php HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 21:25:12 [error] 12909#0: *443 connect() failed (111: Connection refused) while connecting to upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /favicon.ico HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 22:08:32 [error] 12909#0: *1025 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 22:09:33 [error] 12909#0: *1025 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /favicon.ico HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 22:09:40 [error] 12909#0: *1064 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /info.php HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 22:09:40 [error] 12909#0: *1064 connect() failed (111: Connection refused) while connecting to upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /favicon.ico HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 22:24:44 [error] 12909#0: *1313 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" 2010/09/06 22:24:53 [error] 12909#0: *1313 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 192.168.0.1, server: rahul286.rtcamp.info, request: "GET /favicon.ico HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "rahul286.rtcamp.info" As I run php-fpm on port 9000 via TCP mode, I ran "netstat | grep 9000" and noticed something unusual... 
(Pasting partial output here for ease of read) tcp 9 0 localhost:9000 localhost:36094 CLOSE_WAIT 14269/php5-fpm tcp 0 0 localhost:46664 localhost:9000 FIN_WAIT2 - tcp 1257 0 localhost:9000 localhost:36135 CLOSE_WAIT - tcp 1257 0 localhost:9000 localhost:36125 CLOSE_WAIT - tcp 9 0 localhost:9000 localhost:36102 CLOSE_WAIT 14268/php5-fpm tcp 0 0 localhost:46662 localhost:9000 FIN_WAIT2 - tcp 745 0 localhost:9000 localhost:46644 CLOSE_WAIT - tcp 0 0 localhost:46658 localhost:9000 FIN_WAIT2 - tcp 1265 0 localhost:9000 localhost:46607 CLOSE_WAIT - tcp 0 0 localhost:46672 localhost:9000 ESTABLISHED 12909/nginx: worker tcp 1257 0 localhost:9000 localhost:36119 CLOSE_WAIT - tcp 1265 0 localhost:9000 localhost:46613 CLOSE_WAIT - tcp 0 0 localhost:46646 localhost:9000 FIN_WAIT2 - tcp 1257 0 localhost:9000 localhost:36137 CLOSE_WAIT - tcp 0 0 localhost:46670 localhost:9000 ESTABLISHED 12909/nginx: worker tcp 1265 0 localhost:9000 localhost:46619 CLOSE_WAIT - tcp 1336 0 localhost:9000 localhost:46668 ESTABLISHED - tcp 0 0 localhost:46648 localhost:9000 FIN_WAIT2 - tcp 1336 0 localhost:9000 localhost:46670 ESTABLISHED - tcp 9 0 localhost:9000 localhost:36108 CLOSE_WAIT 14274/php5-fpm tcp 1336 0 localhost:9000 localhost:46684 ESTABLISHED - tcp 0 0 localhost:46674 localhost:9000 ESTABLISHED 12909/nginx: worker tcp 1336 0 localhost:9000 localhost:46666 ESTABLISHED - tcp 1257 0 localhost:9000 localhost:46648 CLOSE_WAIT - tcp 1336 0 localhost:9000 localhost:46678 ESTABLISHED - tcp 0 0 localhost:46668 localhost:9000 ESTABLISHED 12909/nginx: wo There are plenty of "CLOSE_WAIT" & "FIN_WAIT2" pairs as highlighted below (in above output): tcp 1337 0 localhost:9000 localhost:46680 CLOSE_WAIT - tcp 0 0 localhost:46680 localhost:9000 FIN_WAIT2 - Please note port 46680 in above. I enabled mysql slow queries error log, but it didn't work. As of now restarting php5-fpm every minute via a cronjob (see command below) keeping everything running "smoothly" but I hate patchwork and want to solve this... 1 * * * * service php5-fpm restart > /dev/null I searched extensively on Google - got no help. As mentioned, this a test-server in LAN, CPU load is never crossed 0.10 and memory usage is also below 25% (System has 2GB RAM and ubuntu-server installed) So if you find its time-confusing to help me out, please atleast drop a hint. Thanks in advance for help. -Rahul (note - this is reposting of - http://forum.nginx.org/read.php?11,127694) Update: I found answer, which is posted below.

    Read the article

  • Virtual Box Pen Test Lab Set Up

    - by hairyjewbear
    So i'm trying to set up a pen test lab in virtual box on my windows 7 host. I have 3 guest OS's installed: 1.) BackTrack5 2.) Centos 5 Server/Snort (My Snortbox) 3.) Win XP (Unpatched) I have 3 Ethernet adapters created IP'S 192.168.191.1 192.168.127.1 192.168.56.1 My goal is to use BackTrack5 to nmap the Win XP guest and have the snort box sniff the network. I'm new to networking and virtualization and I need help setting up my virtual network to get this to work. What should I do? All help appreciated Centos: Adapter1: NAT Adapter2: Host-only Adapter3: Internal Network Backtrack: Adapter1: Internal Network XP: Adapter1: Internal Network Also take for granted I'm on a University Network with a ridiculous firewall so I need to stay all within the host

    Read the article

  • SMTP server (IIS) is running but can't test it with telnet

    - by NitroxDM
    I have a Windows 2003 Web Edition server on which I can't seem to get the SMTP relay working. BT4 shows port 25 open. When I try to use telnet to test it from my desktop I get: Connecting To XXX.XXX.XXX.XXX...Could not open connection to the host, on port 25: Connect failed. From the server I get: Microsoft Telnet> o 127.0.0.1 25 Connecting To 127.0.0.1... Connection to host lost. There isn't anything useful in the logs. Any ideas?
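
    Besides telnet, any SMTP client that reports the exact failure mode can narrow this down: whether the TCP connection is refused, times out (often a firewall or binding issue), or opens and is then dropped before the conversation gets going (which tends to point at the service's connection or relay restrictions). A rough Python sketch using the standard smtplib, with the placeholder address from the question:

      import smtplib

      HOST, PORT = "XXX.XXX.XXX.XXX", 25   # placeholder address from the question

      try:
          smtp = smtplib.SMTP(HOST, PORT, timeout=10)
          code, banner = smtp.ehlo()
          print("connected, EHLO replied:", code, banner.decode(errors="replace"))
          smtp.quit()
      except smtplib.SMTPException as exc:
          print("TCP connection opened, but the SMTP conversation failed:", exc)
      except OSError as exc:
          print("TCP-level failure (firewall, binding, or service down):", exc)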

    Read the article

  • Where is Amazon Linux AMI Test Page EC2?

    - by fuzzybee
    I have set up my websites as directories directly under /var/www/html/ and they are working just fine (the websites are mapped to virtual hosts). So this is mainly out of curiosity for the moment. Furthermore, being able to customise this might bring some benefits in the future, e.g. branding the elastic IPs my computers use temporarily. Notes: I can always create an index.html page under /var/www/html/ and modify it, but that's not my goal here. I can also map the elastic IP address to a directory /var/www/html/default/ and do my stuff there, but that is not my goal here either. My goal is to find the Amazon Linux AMI test page. I've tried running Linux commands to find it, but that obviously takes too long.

    Read the article

  • Testing home directory scripts by setting $HOME to the location of the test directory

    - by intuited
    I have an interdependent collection of scripts in my ~/bin directory as well as a developed ~/.vim directory and some other libraries and such in other subdirectories. I've been versioning all of this using git, and have realized that it would be potentially very easy and useful to do development and testing of new and existing scripts, vim plugins, etc. using a cloned repo, and then pull the working code into my actual home directory with a merge. The easiest way to do this would seem to be to just change & export $HOME, eg cd ~/testing; git clone ~ home export HOME=~/testing/home cd ~ screen -S testing-home # start vim, write/revise plugins, edit scripts, etc. # test revisions However since I've never tried this before I'm concerned that some programs, environment variables, etc., may end up using my actual home directory instead of the exported one. Is this a viable strategy? Are there just a few outliers that I should be careful about? Is there a much better way to do this sort of thing?
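
    The concern about outliers is well founded: most shell tools and libraries honour $HOME, but anything that resolves the home directory through the passwd database (getpwuid) ignores the override. A quick Python sketch that demonstrates the difference, assuming the ~/testing/home layout from the question:

      import os
      import pwd
      import subprocess

      test_home = os.path.expanduser("~/testing/home")
      env = dict(os.environ, HOME=test_home)

      # Tools that honour $HOME see the sandbox...
      subprocess.run(["bash", "-c", 'echo "shell thinks HOME=$HOME"'], env=env, check=True)
      print("expanduser with HOME overridden:",
            subprocess.run(["python3", "-c", "import os; print(os.path.expanduser('~'))"],
                           env=env, capture_output=True, text=True).stdout.strip())

      # ...while anything that asks the passwd database (getpwuid) still sees the real one.
      print("passwd database says:", pwd.getpwuid(os.getuid()).pw_dir)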

    Read the article

  • how to test lan network speed with ps3

    - by Damon
    I am having a heck of a time trying to get videos streaming to my PS3. I want to know if there is some way to test the speed between my PC and ps3 to see if that is the issue. For some reason when things get really slow between the ps3 and my computer (it's a wired connection through a switch), the wireless internet on completely different computer gets slow. I don't know what's causing what, but I don't know why there should be any correlation between internet speed on a wirelessly connected laptop and the LAN speed between a wired ps3 and computer.

    Read the article

  • I'm looking for a monitor tool to test Jitter, ICMP, Traceroute and other network issues

    - by Elad Dotan
    I'm looking for a monitoring tool to test jitter, ICMP, traceroute and other network issues. It could be an application I run from my company network in NY and London, or a SaaS service that can do it for me. I have a problem in my data center that I would like to fix; it happens at different times of the day. I want to run the monitor for a few days and save the results so we can analyze them. Any ideas? Thanks!! Elad.

    Read the article

  • How to perform diagnostics (stress test) on HP Smartarray Controller

    - by pepoluan
    At my office, we have a server that we suspect its RAID controller (HP Smartarray) is failing. A cold boot, however, does not indicate anything. Can anyone recommend me a method to stress-test the controller? Symptoms that makes me suspect a failing controller: Disk access getting slower, queue getting longer Running dmesg on the XenServer console I see many messages similar to this one: end_request: I/O error, dev tda, sector 253655584 (the sector number is never the same) When we move the VM to another physical host, we no longer see the above message Running idle (without any running VM), the dmesg no longer emit the above message A search on Google indicated that the above message is most commonly associated with a failing SmartArray controller. How can I be sure that the SmartArray controller is failing?

    Read the article

  • How to test a LDAP connection from a client

    - by FELDAP
    How do I check the LDAP connection from a client to the server? I'm working on LDAP authentication and this client desktop needs to authenticate via an LDAP server. I can SSH to the LDAP server using an LDAP user, but at the desktop login prompt I can't log in; it says "Authentication failure". The client machine has CentOS 6.3, the LDAP server has CentOS 5.5, and the LDAP software is OpenLDAP. The LDAP server's logs don't even show any messages. So, how do I test whether the client can successfully connect to LDAP or not?
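
    Besides ldapsearch from the client, a scripted bind against the server quickly shows whether the connection and credentials work at the LDAP layer, as opposed to the PAM/NSS configuration on the desktop. A rough sketch using the third-party ldap3 Python library; the host and DN below are placeholders, not values from the question:

      from ldap3 import ALL, Connection, Server

      server = Server("ldap://ldap-server.example.com", get_info=ALL)
      conn = Connection(server,
                        user="uid=someuser,ou=People,dc=example,dc=com",
                        password="secret")

      if conn.bind():
          print("bind succeeded")
      else:
          print("bind failed:", conn.result)
      conn.unbind()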

    Read the article

  • How to test DNS glue record?

    - by Sunnz
    Hello I have just set up a DNS server for my domain example.org with 2 name servers ns1.example.org and ns2.example.org. I have attempted to set up a glue record for ns1 and ns2 at my registrar. It seems to work for now when I do a dig example.org but when I do a whois example.org it lists ns1.example.org and ns2.example.org but not their IP address which should be set up as a glue record. So I am wondering how do I check for the existence of a glue record? Do I do it with whois? I have seen .com and .net whois records that have both the domain name as well as the IP address for the name servers, is .org different? What's the proper way to test this? Thanks.
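
    whois output for .org registrations does not normally list host IPs, so it is not a reliable glue check; the glue lives in the parent (.org) zone, so the test is to query one of the .org servers without recursion and inspect the ADDITIONAL section (dig +norecurse against a .org server does the same thing). A sketch using the dnspython library (2.x API), with example.org as the placeholder domain:

      import dns.flags
      import dns.message
      import dns.query
      import dns.resolver

      DOMAIN = "example.org."

      # Find one authoritative server for the parent (.org) zone.
      parent_ns = next(iter(dns.resolver.resolve("org.", "NS"))).target
      parent_ip = next(iter(dns.resolver.resolve(parent_ns, "A"))).address

      # Ask it for the domain's NS records without recursion; glue, if present,
      # shows up in the ADDITIONAL section of the referral.
      query = dns.message.make_query(DOMAIN, "NS")
      query.flags &= ~dns.flags.RD
      response = dns.query.udp(query, parent_ip, timeout=5)

      print("AUTHORITY:", *response.authority, sep="\n  ")
      print("ADDITIONAL (glue):", *response.additional, sep="\n  ")

    Glue is only required when the nameserver names sit inside the domain they serve (as ns1/ns2.example.org do here); for out-of-zone nameservers an empty ADDITIONAL section is normal.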

    Read the article

  • Test tomcat for ssl renegotiation vulnerability

    - by Jim
    How can I test if my server is vulnerable for SSL renegotiation? I tried the following (using OpenSSL 0.9.8j-fips 07 Jan 2009: openssl s_client -connect 10.2.10.54:443 I see it connects, it brings the certificate chain, it shows the server certificate, and last: SSL handshake has read 2275 bytes and written 465 bytes --- New, TLSv1/SSLv3, Cipher is DHE-RSA-AES256-SHA Server public key is 1024 bit Secure Renegotiation IS supported Compression: NONE Expansion: NONE SSL-Session: Protocol : TLSv1 Cipher : DHE-RSA-AES256-SHA Session-ID: 50B4839724D2A1E7C515EB056FF4C0E57211B1D35253412053534C4A20202020 Session-ID-ctx: Master-Key: 7BC673D771D05599272E120D66477D44A2AF4CC83490CB3FDDCF62CB3FE67ECD051D6A3E9F143AE7C1BA39D0BF3510D4 Key-Arg : None Start Time: 1354008417 Timeout : 300 (sec) Verify return code: 21 (unable to verify the first certificate) What does Secure Renegotiation IS supported mean? That SSL renegotiation is allowed? Then I did but did not get an exception or get the certificate again: verify error:num=20:unable to get local issuer certificate verify return:1 verify error:num=27:certificate not trusted verify return:1 verify error:num=21:unable to verify the first certificate verify return:1 HTTP/1.1 200 OK Server: Apache-Coyote/1.1 Content-Type: text/html;charset=ISO-8859-1 Content-Length: 174 Date: Tue, 27 Nov 2012 09:13:14 GMT Connection: close So is the server vulnerable to SSL renegotiation or not?
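
    "Secure Renegotiation IS supported" in that output means the server advertises RFC 5746 secure renegotiation, the fix for the 2009 renegotiation flaw, which is a good sign rather than a bad one. The usual manual check for the remaining question, whether client-initiated renegotiation is still accepted at all, is to type a line starting with R in the s_client session and see whether the handshake repeats or the connection drops. A rough Python wrapper around that idea; the output strings checked here vary between OpenSSL versions, so treat it as a probe, not a verdict:

      import subprocess

      HOST = "10.2.10.54:443"

      # Feed "R" (renegotiate) and then "Q" (quit) to openssl s_client
      # and look at what comes back.
      proc = subprocess.run(
          ["openssl", "s_client", "-connect", HOST],
          input="R\nQ\n",
          capture_output=True,
          text=True,
          timeout=30,
      )
      out = proc.stdout + proc.stderr
      if "RENEGOTIATING" in out and "handshake failure" not in out:
          print("server appears to accept client-initiated renegotiation")
      else:
          print("renegotiation request was refused or failed")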

    Read the article

  • PRNG test suite: bitstream and stream length

    - by Martin Trigaux
    On the NIST website, there is a tool called sts (Statistical Test Suite) that allows us to test the validity of a pseudo-random number generator based on a stream of bits given as input. When running the program, there are two variables I am not sure I understand: the stream length and the number of bitstreams. Is the stream length the size of the file? The number of bits inside? The size of a bitstream? Are the bitstreams subsets of the whole file? Chosen how? Let's say I have a text file containing 1,000,000 bits in ASCII. What should my arguments be? You can find the user manual here if needed (I didn't find an explanation of these variables in it). Thank you
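
    As far as the STS documentation describes it, the stream length passed to assess is the number of bits in each individual sequence under test, and the "How many bitstreams?" prompt asks how many such consecutive, non-overlapping sequences to read from the input file, so the file must hold at least stream_length x bitstreams usable bits. For the 1,000,000-bit ASCII file in the question, ./assess 100000 with 10 bitstreams is one workable split. A small Python helper to sanity-check a chosen split (the filename is a placeholder):

      def sts_split(path, stream_length, bitstreams):
          """Check whether an ASCII '0'/'1' file holds enough bits for
          `bitstreams` consecutive sequences of `stream_length` bits each."""
          with open(path) as f:
              bits = sum(ch in "01" for ch in f.read())
          needed = stream_length * bitstreams
          print(f"{bits} usable bits, {needed} needed "
                f"({'enough' if bits >= needed else 'NOT enough'})")

      # e.g. for the 1,000,000-bit file from the question:
      #   ./assess 100000   and answer 10 at the "How many bitstreams?" prompt
      sts_split("random.txt", 100_000, 10)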

    Read the article

  • Stress test a server for simultaneous connection

    - by weston smith
    I am trying to figure out a practical way to stress test a server for 300 to 600 simultaneous connections. Any advice? Thank you everyone for the help. To be more specific (sorry I wasn't before), this is a Flash Media Server on AWS that will be streaming live video. I've been having problems with the video freezing/buffering for everyone, and I need to verify whether it's on the user end, the upload end, or the server end. I mainly need help with stress testing the server with 300-600 simultaneous requests before going live.
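
    A plain concurrent-socket test only shows whether the box will accept and hold a few hundred TCP connections; it does not speak RTMP or pull video, so it cannot reproduce real streaming load, but it is a cheap first smoke test before reaching for a protocol-aware load tool. A minimal asyncio sketch (the hostname is a placeholder; 1935 is the usual RTMP port):

      import asyncio

      HOST, PORT = "fms.example.com", 1935   # placeholder host; 1935 is the default RTMP port
      CONNECTIONS = 300                      # raise the OS open-file limit if needed

      async def hold_connection(seconds=30):
          try:
              reader, writer = await asyncio.open_connection(HOST, PORT)
              await asyncio.sleep(seconds)   # hold the socket open
              writer.close()
              await writer.wait_closed()
              return True
          except OSError:
              return False

      async def main():
          results = await asyncio.gather(*(hold_connection() for _ in range(CONNECTIONS)))
          print(f"{sum(results)}/{CONNECTIONS} connections held")

      asyncio.run(main())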

    Read the article

  • External stereo receiver and speakers on desktop PC work during test but no other time

    - by Debbie
    I have a desktop PC running Windows 7. I have connected an external stereo receiver and speakers to it and everything works fine when I test it in the Realtek Audio Sound through the control panel. However, when I play a video or an audio CD, the stereo speakers do not work. I have line in, line out, and speakers connected to the receiver. I have been through the configure speakers portion also. Is there someplace else I need to configure?

    Read the article

  • How Do You Stress-Test Your Hard Drives?

    - by MetaHyperBolic
    When looking for large new drives (>= 1 TB) on Newegg and the like, I note a number of reviews talking about drives being either D.O.A. or hitting the Click of Death (or even releasing the Magic Smoke) within a week or so of use. A portion of the reviews mention this phenomenon whether the drive in question is Western Digital or Hitachi or whatever. For those of you using Windows, what do you do to: 1) Place a large initial stress on the drive to see if it can take it? For how long? 2) Test the drive afterwards (presumably with some sort of S.M.A.R.T. tool or other) to see if any negative changes have been noted? Note: This is one component of a larger plan for both high availability and backups for my home data.
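
    One language-agnostic way to apply the "large initial stress" described in (1) is a full write/read-back/verify pass over the new drive, followed for (2) by a look at the S.M.A.R.T. counters (reallocated, pending and uncorrectable sectors) with the vendor tool or smartctl. A rough Python sketch of the burn-in pass; the drive letter and sizes are placeholders to adjust:

      import hashlib
      import os

      TARGET_DIR = r"E:\burnin"          # hypothetical mount point of the new drive
      FILE_SIZE = 4 * 1024**3            # 4 GiB per file
      FILES = 8
      CHUNK = 8 * 1024**2                # 8 MiB chunks

      os.makedirs(TARGET_DIR, exist_ok=True)
      checksums = {}

      # Write pseudo-random data and remember each file's checksum.
      for n in range(FILES):
          path = os.path.join(TARGET_DIR, f"burnin_{n}.bin")
          h = hashlib.sha256()
          with open(path, "wb") as f:
              written = 0
              while written < FILE_SIZE:
                  block = os.urandom(CHUNK)
                  h.update(block)
                  f.write(block)
                  written += len(block)
          checksums[path] = h.hexdigest()

      # Read everything back and compare.
      for path, expected in checksums.items():
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for block in iter(lambda: f.read(CHUNK), b""):
                  h.update(block)
          print(path, "OK" if h.hexdigest() == expected else "MISMATCH")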

    Read the article
