This post discusses some tips for optimizing performance in Unity C# programming:
1. Use foreach() as infrequently as possible. foreach(...) is bad in Unity programming: with Unity's older Mono compiler, a foreach(...) loop generates about 24 bytes of garbage memory each time it runs because the enumerator gets boxed. If you are iterating over an array or a List in C#, replace the foreach loop with a for loop. If you are using other collection classes such as Dictionary and HashSet, call GetEnumerator() on the collection and drive it with a while loop instead (see the sketch below).
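For example, the replacement patterns might look something like this (the class and field names below are just for illustration):

using System.Collections.Generic;
using UnityEngine;

public class LoopExamples : MonoBehaviour
{
    List<int> scores = new List<int>();
    Dictionary<string, int> lookup = new Dictionary<string, int>();

    void Update()
    {
        // For arrays and List<T>, index with a plain for loop instead of foreach.
        for (int i = 0; i < scores.Count; ++i)
        {
            int score = scores[i];
            // ... do something with score ...
        }

        // For Dictionary, HashSet, etc., pull the enumerator manually and step it with while.
        var enumerator = lookup.GetEnumerator();
        while (enumerator.MoveNext())
        {
            KeyValuePair<string, int> pair = enumerator.Current;
            // ... do something with pair.Key / pair.Value ...
        }
    }
}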
2. Use DateTime.UtcNow instead of DateTime.Now for tasks such as tracking elapsed time. DateTime.Now performs a time-zone conversion and therefore consumes more computational resources than DateTime.UtcNow (see the sketch below).
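A minimal sketch of tracking elapsed time with DateTime.UtcNow (the class name is just for illustration):

using System;
using UnityEngine;

public class ElapsedTimeExample : MonoBehaviour
{
    private DateTime startUtc;

    void Start()
    {
        // UtcNow skips the time-zone conversion that DateTime.Now performs.
        startUtc = DateTime.UtcNow;
    }

    void Update()
    {
        TimeSpan elapsed = DateTime.UtcNow - startUtc;
        // ... use elapsed.TotalSeconds for timing logic ...
    }
}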
3. Implement OnBecameInvisible() on your MonoBehaviour class to hide and stop some of the processing of a game object whenever it becomes invisible (note that the game object must have a Renderer component for this message to be invoked at runtime); see the sketch below.
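A minimal sketch of this idea (the class name is just for illustration; OnBecameVisible() is used here to resume processing):

using UnityEngine;

// Requires a Renderer on the same game object for the visibility messages to fire.
public class VisibilityExample : MonoBehaviour
{
    void OnBecameInvisible()
    {
        // Stop expensive per-frame work while no camera can see this object.
        enabled = false;
    }

    void OnBecameVisible()
    {
        // Resume processing when the object is visible again.
        enabled = true;
    }
}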
4. Creating and destroying objects with Instantiate and Destroy at runtime can be expensive and can drag down the frame rate. Apart from putting these operations in a coroutine started with StartCoroutine, one workaround is to create a set of game objects at the beginning of the game and store them in a collection such as a Queue. Then, instead of calling Instantiate during gameplay, call Dequeue() on the Queue to obtain an already created game object and invoke its SetActive(true) method. And instead of calling Destroy, call Enqueue() on the Queue to put the game object back into the queue (remember to call the game object's SetActive(false) first). This avoids the performance hit of repeated Instantiate and Destroy calls (see the sketch below).
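A minimal pooling sketch along these lines (the BulletPool class, the prefab field, and the pool size are hypothetical):

using System.Collections.Generic;
using UnityEngine;

public class BulletPool : MonoBehaviour
{
    public GameObject bulletPrefab;   // hypothetical prefab to pool
    public int poolSize = 50;

    private Queue<GameObject> pool = new Queue<GameObject>();

    void Start()
    {
        // Pre-create the objects once, deactivated, instead of Instantiating during gameplay.
        for (int i = 0; i < poolSize; ++i)
        {
            GameObject go = (GameObject)Instantiate(bulletPrefab);
            go.SetActive(false);
            pool.Enqueue(go);
        }
    }

    // Call this instead of Instantiate.
    public GameObject Spawn(Vector3 position)
    {
        GameObject go = pool.Dequeue();
        go.transform.position = position;
        go.SetActive(true);
        return go;
    }

    // Call this instead of Destroy.
    public void Despawn(GameObject go)
    {
        go.SetActive(false);
        pool.Enqueue(go);
    }
}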
5. For a very large number of objects stored in a generic list, performance can be gained by splitting the list into several collections, each of which stores one property of the object. In my personal experience, I once had a List<Employee> collection in a Unity game where the number of Employee objects stored in the list was close to 9 million. Adding or removing an Employee object from this List<Employee> became extremely slow. In the end, what I did was design a collection class, CustomizedList, that resembles List<Employee> but stores the data property by property. The Employee class can be something like the following:
using System;

class Employee
{
    public int Age;
    public int YearsOfExperience;
    public float Salary;
    public string Occupation;
    public string ID;

    public Employee()
    {
        // Give each employee a unique string ID.
        ID = Guid.NewGuid().ToString();
    }

    public override int GetHashCode()
    {
        return ID.GetHashCode();
    }

    public override bool Equals(object rhs)
    {
        Employee rhs1 = rhs as Employee;
        if (rhs1 != null)
        {
            // Two employees are equal when their IDs match.
            return ID == rhs1.ID;
        }
        return false;
    }
}
and my collection class looks something like this (note that this is only a demo class):
using System.Collections.Generic;

class CustomizedList
{
    // Each property of Employee is stored in its own parallel list.
    public List<int> Age = new List<int>();
    public List<int> YearsOfExperience = new List<int>();
    public List<float> Salary = new List<float>();
    public List<string> Occupation = new List<string>();
    public List<string> ID = new List<string>();

    public void Add(Employee e)
    {
        Age.Add(e.Age);
        YearsOfExperience.Add(e.YearsOfExperience);
        Salary.Add(e.Salary);
        Occupation.Add(e.Occupation);
        ID.Add(e.ID);
    }

    public Employee this[int index]
    {
        get
        {
            // Reassemble an Employee from the parallel lists.
            Employee e = new Employee();
            e.ID = ID[index];
            e.Age = Age[index];
            e.Salary = Salary[index];
            e.Occupation = Occupation[index];
            e.YearsOfExperience = YearsOfExperience[index];
            return e;
        }
    }

    public void RemoveAt(int index)
    {
        Age.RemoveAt(index);
        ID.RemoveAt(index);
        Salary.RemoveAt(index);
        Occupation.RemoveAt(index);
        YearsOfExperience.RemoveAt(index);
    }

    public int IndexOf(Employee e)
    {
        return ID.IndexOf(e.ID);
    }

    public void Remove(Employee e)
    {
        int index = IndexOf(e);
        if (index >= 0)
        {
            RemoveAt(index);
        }
    }

    public int Count
    {
        get
        {
            return ID.Count;
        }
    }

    public IEnumerable<Employee> Iterate()
    {
        for (int i = 0; i < Count; ++i)
        {
            yield return this[i];
        }
    }
}
The basic concept is very simple: decompose each Employee object into its properties, each of which is stored in a separate list when the Add(Employee e) method is called. CustomizedList shows a tremendous increase in performance compared to List<Employee> when the number of items stored is extremely large. Of course, when processing such a large collection, StartCoroutine is still required as a time-slicing method for long-running work (please refer to http://czcodezone.blogspot.com/2015/01/unity-use-startcoroutine-to-perform.html).
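A minimal sketch of how CustomizedList might be used together with StartCoroutine to time-slice the work (the EmployeeLoader class and the slice size of 100000 items per frame are hypothetical):

using System.Collections;
using UnityEngine;

public class EmployeeLoader : MonoBehaviour
{
    private CustomizedList employees = new CustomizedList();

    void Start()
    {
        StartCoroutine(LoadEmployees(9000000));
    }

    // Build the collection in slices so that no single frame is blocked for too long.
    IEnumerator LoadEmployees(int count)
    {
        for (int i = 0; i < count; ++i)
        {
            employees.Add(new Employee { Age = 30, YearsOfExperience = 5, Salary = 1000f, Occupation = "Developer" });

            if (i % 100000 == 0)
            {
                yield return null; // hand control back to Unity for this frame
            }
        }
    }
}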