Well, the problem of finding the solution to
$$
y'' + p(x) y' + q(x) y = f(x)
$$
when the solutions $y_1$ and $y_2$ of the homogeneous problem are known can be thought of as follows:
I know $y(x) = c_1 y_1(x) + c_2 y_2(x)$ solves
$$
y'' + p(x) y' + q(x) y = 0.
$$
What if I let $c_1$ and $c_2$ vary with $x$, so as to satisfy the non-homogeneous equation point by point? I then propose $y(x) = c_1(x) y_1(x) + c_2(x) y_2(x)$ and see where it leads. To that end, we calculate
$$
y'(x) = c_1(x) y_1'(x) + c_2(x) y_2'(x) + c_1'(x) y_1(x) + c_2'(x) y_2(x),
$$
and $y''$. Before doing that, let's take a look at $y'(x)$. If we allow the terms with the first derivatives of $c_1$ and $c_2$ to remain, then $y''$ will contain second derivatives of $c_1$ and $c_2$, and we gain nothing in simplicity. What was a second-order ODE in one unknown ($y$) becomes a second-order ODE in two unknowns ($c_1$ and $c_2$), without a second equation. This is why, in the construction, one demands
$$
c_1'(x) y_1(x) + c_2'(x) y_2(x) = 0.
$$
If this condition is fulfilled, we achieve something very important: the non-homogeneous second-order ODE is reduced to a system of two first-order ODEs, which, incidentally, we know how to solve exactly (modulo integration). This is one rational explanation for choosing the above condition.
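Explicitly, once the condition is imposed, the derivatives simplify to
$$
y' = c_1 y_1' + c_2 y_2', \qquad y'' = c_1 y_1'' + c_2 y_2'' + c_1' y_1' + c_2' y_2',
$$
and substituting into the ODE, every term without a prime on $c_1$ or $c_2$ cancels, because $y_1$ and $y_2$ solve the homogeneous equation. What remains is
$$
c_1' y_1' + c_2' y_2' = f(x).
$$
Together with the imposed condition $c_1' y_1 + c_2' y_2 = 0$, this is the promised first-order system for $c_1'$ and $c_2'$.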
Now back to the problem at hand. When one has initial conditions, the problem is stated in the following way:
$$
y'' + p(x) y' + q(x) y = f(x), \quad y(a) = y_0, \quad y'(a) = y_0', \quad a < x < b.
$$
The easiest way to solve the full problem is to take advantage of the linearity and split it in two:
$$
y_h'' + p(x) y_h' + q(x) y_h = 0, \quad y_h(a) = y_0, \quad y_h'(a) = y_0',
$$
and
$$
y_p'' + p(x) y_p' + q(x) y_p = f(x), \quad y_p(a) = 0, \quad y_p'(a) = 0,
$$
and then $y(x) = y_h(x) + y_p(x)$. In other words, we solve the homogeneous ODE with non-homogeneous initial conditions, and then the non-homogeneous ODE with homogeneous initial conditions, treating the non-homogeneities separately.
The homogeneous solution will be
$$
y_h(x) = k_1 y_1(x) + k_2 y_2(x),
$$
where $y_1$ and $y_2$ are the two linearly independent solutions, and $k_1$ and $k_2$ are determined by solving the system
$$
\begin{pmatrix} y_1(a) & y_2(a) \\ y_1'(a) & y_2'(a) \end{pmatrix} \begin{pmatrix} k_1 \\ k_2 \end{pmatrix} = \begin{pmatrix} y_0 \\ y_0' \end{pmatrix}.
$$
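As a concrete sketch of this step (a toy example of mine, with names of my choosing): for $y'' + y = 0$ we can take $y_1 = \cos x$, $y_2 = \sin x$, and the $2\times 2$ system for $k_1$, $k_2$ can be solved by Cramer's rule, since its determinant is the Wronskian at $a$.

```python
import math

def solve_initial_constants(y1, dy1, y2, dy2, a, y0, dy0):
    """Solve [[y1(a), y2(a)], [y1'(a), y2'(a)]] (k1, k2)^T = (y0, y0')^T
    by Cramer's rule (valid when the Wronskian at x = a is nonzero)."""
    A, B = y1(a), y2(a)
    C, D = dy1(a), dy2(a)
    det = A * D - B * C          # Wronskian of y1, y2 evaluated at x = a
    k1 = (y0 * D - B * dy0) / det
    k2 = (A * dy0 - y0 * C) / det
    return k1, k2

# Example: y'' + y = 0, y1 = cos, y2 = sin, y(0) = 2, y'(0) = 3.
k1, k2 = solve_initial_constants(math.cos, lambda x: -math.sin(x),
                                 math.sin, math.cos, 0.0, 2.0, 3.0)
print(k1, k2)  # k1 = 2, k2 = 3, so y_h(x) = 2 cos x + 3 sin x
```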
For $y_p$, we propose the solution
$$
y_p(x) = c_1(x) y_1(x) + c_2(x) y_2(x)
$$
with the condition $c_1'(x) y_1(x) + c_2'(x) y_2(x) = 0$ (explained above). Then, substituting into its ODE, we can determine $c_1(x)$ and $c_2(x)$ by solving the system
$$
\begin{pmatrix} y_1(x) & y_2(x) \\ y_1'(x) & y_2'(x) \end{pmatrix} \begin{pmatrix} c_1'(x) \\ c_2'(x) \end{pmatrix} = \begin{pmatrix} 0 \\ f(x) \end{pmatrix}
$$
and then
\begin{align}
c_1'(x) &= -\frac{y_2(x) f(x)}{W(y_1,y_2)(x)} \\
\\
c_2'(x) &= \frac{y_1(x) f(x)}{W(y_1,y_2)(x)}
\end{align}
where $W(y_1,y_2)(x) = y_1 y_2'- y_1' y_2$ is the Wronskian of $y_1$ and $y_2$.
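A quick numerical check of this algebra (my own toy example): with $y_1 = \cos$, $y_2 = \sin$ (so $W \equiv 1$) and an arbitrary sample value of $f$, the stated $c_1'$, $c_2'$ should make the first row of the system vanish and the second row equal $f$.

```python
import math

def check_system(x, f):
    """Verify that c1' = -y2 f / W and c2' = y1 f / W satisfy
    y1 c1' + y2 c2' = 0 and y1' c1' + y2' c2' = f, for y1 = cos, y2 = sin."""
    y1, y2 = math.cos(x), math.sin(x)
    dy1, dy2 = -math.sin(x), math.cos(x)
    W = y1 * dy2 - dy1 * y2              # Wronskian (equals 1 here)
    c1p = -y2 * f / W
    c2p = y1 * f / W
    row1 = y1 * c1p + y2 * c2p           # should be 0
    row2 = dy1 * c1p + dy2 * c2p         # should be f
    return row1, row2

r1, r2 = check_system(0.8, 2.5)
print(r1, r2)  # first residual vanishes, second recovers f
```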
The key here is that, in order to obtain $c_1$ and $c_2$, one has to integrate. To do this, we need limits of integration. Conveniently, we know the initial conditions for $y_p$; they can be read as
$$
\begin{pmatrix} y_1(a) & y_2(a) \\ y_1'(a) & y_2'(a) \end{pmatrix} \begin{pmatrix} c_1(a) \\ c_2(a) \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
$$
Since $y_1$ and $y_2$ are linearly independent, this matrix is invertible (its determinant is the Wronskian at $a$), so $c_1(a) = c_2(a) = 0$. Finally, performing the integration,
$$
\int_{a}^x c_1'(s) ds = c_1(x) = -\int_{a}^x \frac{y_2(s)}{W(s)} f(s) ds
$$
$$
\int_{a}^x c_2'(s) ds = c_2(x) = \int_{a}^x \frac{y_1(s)}{W(s)} f(s) ds
$$
and then
$$
y_p(x) = \int_{a}^x \frac{y_1(s) y_2(x) - y_1(x) y_2(s)}{W(s)} f(s) ds
$$
or
$$
y_p(x) = \int_{a}^{b} g(x,s) f(s) ds
$$
where
$$
g(x,s) = \begin{cases} \frac{y_1(s) y_2(x) - y_1(x) y_2(s)}{W(s)} & \mbox{if }a < s < x \\
0 & \mbox{if } x < s < b \end{cases}
$$
Finally
$$
y(x) = k_1 y_1(x) + k_2 y_2(x) + \int_a^b g(x,s) f(s) ds.
$$
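As a numerical sanity check of the $y_p$ formula (a toy example of mine, not from the text): for $y'' + y = 1$ with $y(0) = y'(0) = 0$, we have $y_1 = \cos$, $y_2 = \sin$, $W \equiv 1$, so $g(x,s) = \sin(x-s)$ for $s < x$, and the integral should reproduce the exact particular solution $1 - \cos x$.

```python
import math

def green_ivp(x, s):
    """g(x, s) = [y1(s) y2(x) - y1(x) y2(s)] / W(s) for y1 = cos, y2 = sin.
    Here W = cos^2 + sin^2 = 1 and the numerator collapses to sin(x - s)."""
    return math.cos(s) * math.sin(x) - math.cos(x) * math.sin(s)

def y_particular(x, f, n=2000):
    """y_p(x) = integral_0^x g(x, s) f(s) ds via the composite trapezoid rule."""
    h = x / n
    total = 0.5 * (green_ivp(x, 0.0) * f(0.0) + green_ivp(x, x) * f(x))
    for i in range(1, n):
        s = i * h
        total += green_ivp(x, s) * f(s)
    return total * h

# For f = 1 the exact particular solution is 1 - cos(x).
x = 1.7
print(y_particular(x, lambda s: 1.0), 1.0 - math.cos(x))  # agree to quadrature accuracy
```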
The function $g(x,s)$ is called the Green's function, and it is completely determined by the problem
$$
L y = \frac{d^2 y}{d x^2} + p(x) \frac{d y}{d x} + q(x) y = f(x), \quad B y = \begin{pmatrix} y(a) \\ y'(a) \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad a < x < b
$$
The Green's function is some sort of "inverse" of the operator $L$ with boundary conditions $B$.
What happens with boundary conditions on $a$ and $b$?
Well, in this case, the boundary conditions $B$ are of the form
$$
By = \begin{pmatrix} \alpha_1 y(a) + \beta_1 y'(a) + \gamma_1 y(b) + \delta_1 y'(b) \\ \alpha_2 y(a) + \beta_2 y'(a) + \gamma_2 y(b) + \delta_2 y'(b) \end{pmatrix}
$$
and need not be homogeneous, i.e. $B y = (r_1, r_2)^T$. If the boundary conditions are linearly independent, then the problem is well defined, and we have to solve
$$
L y = f(x), \quad B y = (r_1, r_2)^T, \quad a < x < b.
$$
The case where $\alpha_2 = \beta_1 = \gamma_{1,2} = \delta_{1,2} = 0$ is the well-known initial-value problem, solved above. The case $\gamma_1 = \delta_1 = \alpha_2 = \beta_2 = 0$ is called unmixed, and all other cases are called mixed. Examples of unmixed boundary conditions are Dirichlet conditions $y(a) = y(b) = 0$, Neumann conditions $y'(a) = y'(b) = 0$, and Robin conditions $y(a) + \alpha y'(a) = y(b) + \beta y'(b) = 0$. Examples of mixed boundary conditions are periodic conditions $y(a) = y(b)$, $y'(a) = y'(b)$, and antiperiodic conditions $y(a) = -y(b)$, $y'(a) = -y'(b)$.
As with initial conditions, the best way to work is to split the problem in two:
$$
L y_h = 0, \quad B y_h = (r_1, r_2)^T
$$
and
$$
L y_p = f(x), \quad B y_p = (0, 0)^T
$$
Solving for $y_h$ works the same way for all types of boundary conditions, so I'll skip that problem here (if you have doubts, it is a good exercise to work out the details yourself).
Unmixed boundary conditions
$$
B y = \begin{pmatrix} B_1 y \\ B_2 y \end{pmatrix} = \begin{pmatrix} \alpha_1 y(a) + \beta_1 y'(a) \\ \gamma_2 y(b) + \delta_2 y'(b) \end{pmatrix} = 0
$$
Let $w_1(x) = h_{11} y_1(x) + h_{21} y_2(x)$ and $w_2(x) = h_{12} y_1(x) + h_{22} y_2(x)$, with $h_{ij}$ constants chosen such that $B_1 w_1 = 0$ and $B_2 w_2 = 0$. We propose
$$
y_p(x) = c_1(x) w_1(x) + c_2(x) w_2(x)
$$
with the (very natural) condition $c_1'(x) w_1(x) + c_2'(x) w_2(x) = 0$. As in the initial conditions problem, it's easy to see that
\begin{align}
c_1'(x) &= -\frac{w_2(x) f(x)}{W(w_1,w_2)(x)}\\
\\
c_2'(x) &= \frac{w_1(x) f(x)}{W(w_1,w_2)(x)}
\end{align}
where $W(w_1,w_2)(x) = W(x)$ is the Wronskian of $w_1$ and $w_2$. To determine the limits of integration, we simply evaluate the boundary conditions:
\begin{align}
B_1 y_p &= c_1(a) B_1 w_1 + c_2(a) B_1 w_2 = c_2(a) B_1 w_2 = 0\\
B_2 y_p &= c_1(b) B_2 w_1 + c_2(b) B_2 w_2 = c_1(b) B_2 w_1 = 0
\end{align}
These relations tell us that $c_1$ and $c_2$ must satisfy $c_2(a) = c_1(b) = 0$ in order to fulfill the boundary conditions. Then
$$
\int_x^b c_1'(s) ds = - c_1(x) = -\int_x^b \frac{w_2(s)}{W(s)}f(s) ds, \qquad \mbox{i.e.,} \quad c_1(x) = \int_x^b \frac{w_2(s)}{W(s)}f(s) ds
$$
and
$$
\int_a^x c_2'(s) ds = c_2(x) = \int_a^x \frac{w_1(s)}{W(s)}f(s) ds
$$
Finally
$$
y_p(x) = \int_x^b \frac{w_1(x)w_2(s)}{W(s)} f(s) ds + \int_a^x \frac{w_2(x) w_1(s)}{W(s)} f(s) ds
$$
or
$$
y_p(x) = \int_a^b g(x,s) f(s) ds
$$
where
$$
g(x,s) = \begin{cases} \frac{w_1(s) w_2(x)}{W(s)} & \mbox{if } a < s < x < b\\
\\
\frac{w_1(x) w_2(s)}{W(s)} & \mbox{if } a < x < s < b\end{cases}
$$
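To illustrate (an example of mine, not from the text): for $y'' = f$ on $(0,1)$ with Dirichlet conditions $y(0) = y(1) = 0$, take $w_1(x) = x$ (so $w_1(0) = 0$) and $w_2(x) = x - 1$ (so $w_2(1) = 0$); then $W \equiv 1$, and for $f \equiv 1$ the formula should give the exact solution $x(x-1)/2$.

```python
def green_dirichlet(x, s):
    """Green's function for y'' = f, y(0) = y(1) = 0,
    built from w1(x) = x, w2(x) = x - 1 with Wronskian W = 1."""
    if s < x:
        return s * (x - 1.0)   # w1(s) w2(x) / W(s)
    return x * (s - 1.0)       # w1(x) w2(s) / W(s)

def solve_bvp(x, f, n=2000):
    """y_p(x) = integral_0^1 g(x, s) f(s) ds, composite trapezoid rule."""
    h = 1.0 / n
    total = 0.5 * (green_dirichlet(x, 0.0) * f(0.0)
                   + green_dirichlet(x, 1.0) * f(1.0))
    for i in range(1, n):
        s = i * h
        total += green_dirichlet(x, s) * f(s)
    return total * h

# For f = 1 the exact solution is x (x - 1) / 2.
x = 0.3
print(solve_bvp(x, lambda s: 1.0), x * (x - 1.0) / 2.0)  # agree to quadrature accuracy
```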
Mixed boundary conditions
The easiest way here is to reuse the result above. To avoid a clash of notation with the homogeneous solutions $y_1$ and $y_2$, call the two particular pieces $u_1$ and $u_2$:
$$
u_1(x) = c_{11}(x) w_{11}(x) + c_{12}(x) w_{12}(x)
$$
and
$$
u_2(x) = c_{21}(x) w_{21}(x) + c_{22}(x) w_{22}(x)
$$
where $w_{11}$, $w_{12}$, $w_{21}$, $w_{22}$ are chosen in order to fulfill
$$
B_1 u_1 = \begin{pmatrix} \alpha_1 u_1(a) + \beta_1 u_1'(a) \\ \gamma_2 u_1(b) + \delta_2 u_1'(b)\end{pmatrix} = 0
$$
and
$$
B_2 u_2 = \begin{pmatrix} \gamma_1 u_2(b) + \delta_1 u_2'(b) \\ \alpha_2 u_2(a) + \beta_2 u_2'(a)\end{pmatrix} = 0.
$$
Following the unmixed construction (the factor $\frac{1}{2}$ is there so that the two pieces together reproduce $f$), take
\begin{align}
u_1(x) &= \frac{1}{2}\int_a^b g_1(x,s) f(s) ds\\
\\
u_2(x) &= \frac{1}{2}\int_a^b g_2(x,s) f(s) ds
\end{align}
where
\begin{align}
g_1(x,s) &= \begin{cases} \frac{w_{11}(s) w_{12}(x)}{W(s)} & \mbox{if } a < s < x < b\\
\\
\frac{w_{11}(x) w_{12}(s)}{W(s)} & \mbox{if } a < x < s < b\end{cases}
\\ \\
g_2(x,s) &= \begin{cases} \frac{w_{21}(s) w_{22}(x)}{W(s)} & \mbox{if } a < s < x < b\\
\\
\frac{w_{21}(x) w_{22}(s)}{W(s)} & \mbox{if } a < x < s < b\end{cases}
\end{align}
and $W(s)$ denotes, in each case, the Wronskian of the corresponding pair. Then, taking $y(x) = y_h(x) + u_1(x) + u_2(x)$, where $y_h(x) = k_1 y_1(x) + k_2 y_2(x)$, we have
$$
L y = L y_h + L u_1 + L u_2 = L u_1 + L u_2 = \frac{1}{2}f(x) + \frac{1}{2}f(x) = f(x)
$$
and
\begin{align}
B y &= B y_h + B u_1 + B u_2 \\
&= B y_h + B_1 u_1 + B_1 u_2 + B_2 u_1 + B_2 u_2 \\
&= B y_h + B_2 u_1 + B_1 u_2.
\end{align}
If we choose $k_1$ and $k_2$ in such a way that
$$
B y_h = \begin{pmatrix} r_1 \\ r_2 \end{pmatrix} - B_2 u_1 - B_1 u_2,
$$
then $B y = (r_1, r_2)^T$ as required, and
$$
y(x) = y_h(x) + u_1(x) + u_2(x) = y_h(x) + \int_a^b g(x,s) f(s) ds
$$
where
$$
g(x,s) = \begin{cases} \frac{w_{11}(s) w_{12}(x) + w_{21}(s) w_{22}(x)}{2 W(s)} & \mbox{if } a < s < x < b\\
\\
\frac{w_{11}(x) w_{12}(s) + w_{21}(x) w_{22}(s)}{2W(s)} & \mbox{if } a < x < s < b\end{cases}
$$
A very thorough treatment of the construction of Green's functions for second order ODEs can be found in the fantastic book Principles and Techniques of Applied Mathematics, by Bernard Friedman.