JEE Main & Advanced

  (1) Symmetric determinant : A determinant is called a symmetric determinant if for every element \[{{a}_{ij}}\,=\,\,\,{{a}_{ji\,}}\forall \,\,i,\,j\] e.g., \[\left| \,\begin{matrix} a & h & g  \\ h & b & f  \\  g & f & c  \\ \end{matrix}\, \right|\].   (2) Skew-symmetric determinant : A determinant is called a skew-symmetric determinant if for every element \[{{a}_{ij}}\,=\,-\,{{a}_{ji\,\,}}\forall \,i,\,j\] e.g.,  \[\left| \,\begin{matrix} 0 & 3 & -1  \\ -3 & 0 & 5  \\ 1 & -5 & 0  \\ \end{matrix}\, \right|\]
  • Every diagonal element of a skew-symmetric determinant is always zero.
 
  • The value of a skew symmetric determinant of even order is always a perfect square and that of odd order is always zero.
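The two bullet points above can be checked numerically. Below is a minimal sketch with a hand-rolled pure-Python determinant (the helper name `det` and the sample entries are illustrative); for the even-order case the value equals the square of the Pfaffian, here \[{{(af-be+cd)}^{2}}\].

```python
def det(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

# Odd order (3x3) skew-symmetric determinant: value is 0.
odd = [[ 0,  3, -1],
       [-3,  0,  5],
       [ 1, -5,  0]]

# Even order (4x4) skew-symmetric determinant: value is a perfect
# square, namely (af - be + cd)^2 for the entries a..f below.
a, b, c, d, e, f = 1, 2, 3, 4, 5, 6
even = [[ 0,  a,  b, c],
        [-a,  0,  d, e],
        [-b, -d,  0, f],
        [-c, -e, -f, 0]]

print(det(odd))                        # 0
print(det(even), (a*f - b*e + c*d)**2) # 64 64
```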
    (3) Cyclic order : If the elements of the rows (or columns) are in cyclic order, the following standard results hold:   (i)   \[\left| \,\begin{matrix} 1 & a & {{a}^{2}}  \\ 1 & b & {{b}^{2}}  \\ 1 & c & {{c}^{2}}  \\ \end{matrix}\, \right|=(a-b)(b-c)(c-a)\]   (ii)  \[\left| \,\begin{matrix} a & b & c  \\ {{a}^{2}} & {{b}^{2}} & {{c}^{2}}  \\ bc & ca & ab  \\ \end{matrix}\, \right|=\left| \,\begin{matrix} 1 & 1 & 1  \\ {{a}^{2}} & {{b}^{2}} & {{c}^{2}}  \\ {{a}^{3}} & {{b}^{3}} & {{c}^{3}}  \\ \end{matrix}\, \right|\]     \[=(a-b)(b-c)(c-a)(ab+bc+ca)\]     (iii) \[\left| \,\begin{matrix} a & bc & abc  \\ b & ca & abc  \\ c & ab & abc  \\ \end{matrix}\, \right|=\left| \,\begin{matrix} a & {{a}^{2}} & {{a}^{3}}  \\ b & {{b}^{2}} & {{b}^{3}}  \\ c & {{c}^{2}} & {{c}^{3}}  \\ \end{matrix}\, \right|=abc(a-b)(b-c)(c-a)\]   (iv) \[\left| \,\begin{matrix} 1 & 1 & 1  \\ a & b & c  \\ {{a}^{3}} & {{b}^{3}} & {{c}^{3}}  \\ \end{matrix}\, \right|=(a-b)(b-c)(c-a)(a+b+c)\]   (v)  \[\left| \,\begin{matrix} a & b & c  \\ b & c & a  \\ c & a & b  \\ \end{matrix}\, \right|=-({{a}^{3}}+{{b}^{3}}+{{c}^{3}}-3abc)\]
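Identities (i) and (v) above can be spot-checked for one numeric triple \[(a,\,b,\,c)\]; `det3` below is a hand-rolled 3×3 determinant, and the sample values are illustrative.

```python
def det3(m):
    # First-row cofactor expansion of a 3x3 determinant
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

a, b, c = 2, 5, 7

# (i) Vandermonde-type determinant
lhs_i = det3([[1, a, a*a], [1, b, b*b], [1, c, c*c]])
rhs_i = (a - b)*(b - c)*(c - a)

# (v) circulant determinant
lhs_v = det3([[a, b, c], [b, c, a], [c, a, b]])
rhs_v = -(a**3 + b**3 + c**3 - 3*a*b*c)

print(lhs_i, rhs_i)   # 30 30
print(lhs_v, rhs_v)   # -266 -266
```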

  (1) Solution of system of linear equations in three variables by Cramer's rule : The solution of the system of linear equations  \[{{a}_{1}}x+{{b}_{1}}y+{{c}_{1}}z={{d}_{1}}\]                       .....(i)   \[{{a}_{2}}x+{{b}_{2}}y+{{c}_{2}}z={{d}_{2}}\]                       .....(ii)   \[{{a}_{3}}x+{{b}_{3}}y+{{c}_{3}}z={{d}_{3}}\]                       .....(iii)   is given by \[x=\frac{{{D}_{1}}}{D},\,\,\,\,\,\,y=\frac{{{D}_{2}}}{D}\] and \[z=\frac{{{D}_{3}}}{D}\],   where, \[D=\left| \,\begin{matrix} {{a}_{1}} & {{b}_{1}} & {{c}_{1}}  \\ {{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\ {{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\ \end{matrix}\, \right|\,,\]        \[{{D}_{1}}=\left| \,\begin{matrix} {{d}_{1}} & {{b}_{1}} & {{c}_{1}}  \\ {{d}_{2}} & {{b}_{2}} & {{c}_{2}}  \\ {{d}_{3}} & {{b}_{3}} & {{c}_{3}}  \\ \end{matrix}\, \right|\]   \[{{D}_{2}}=\left| \,\begin{matrix} {{a}_{1}} & {{d}_{1}} & {{c}_{1}}  \\ {{a}_{2}} & {{d}_{2}} & {{c}_{2}}  \\ {{a}_{3}} & {{d}_{3}} & {{c}_{3}}  \\ \end{matrix}\, \right|\,,\] and \[{{D}_{3}}=\left| \,\begin{matrix} {{a}_{1}} & {{b}_{1}} & {{d}_{1}}  \\ {{a}_{2}} & {{b}_{2}} & {{d}_{2}}  \\ {{a}_{3}} & {{b}_{3}} & {{d}_{3}}  \\ \end{matrix}\, \right|\],   provided that \[D\ne 0\].   (2) Conditions for consistency : For a system of 3 simultaneous linear equations in three unknown variables :   (i) If \[D\ne 0\], then the given system of equations is consistent and has a unique solution given by \[x=\frac{{{D}_{1}}}{D},\,\,\,y=\frac{{{D}_{2}}}{D}\] and \[z=\frac{{{D}_{3}}}{D}\]   (ii) If \[D=0\] and \[{{D}_{1}}={{D}_{2}}={{D}_{3}}=0\], then the given system of equations is consistent with infinitely many solutions.   (iii) If \[D=0\] and at least one of the determinants \[{{D}_{1}},\,\,{{D}_{2}},\,\,{{D}_{3}}\] is non-zero, then the given system of equations is inconsistent.
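A minimal Cramer's-rule solver for the 3×3 case above; the coefficient matrix and right-hand side below are made-up illustrative data (the exact solution is \[x=1,\,y=2,\,z=3\]).

```python
def det3(m):
    # First-row cofactor expansion of a 3x3 determinant
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

A = [[2, 1, -1],
     [1, 3,  2],
     [3, 1,  1]]
d = [1, 13, 8]

D = det3(A)
assert D != 0  # Cramer's rule applies only when D != 0

# D1, D2, D3: replace column j of A by the column of constants d
Dj = []
for j in range(3):
    M = [row[:] for row in A]
    for i in range(3):
        M[i][j] = d[i]
    Dj.append(det3(M))

x, y, z = (v / D for v in Dj)
print(x, y, z)   # 1.0 2.0 3.0
```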

  (1) Differentiation of a determinant   (i) Let \[\Delta (x)\] be a determinant of order two. If we write \[\Delta (x)=|{{C}_{1}}\,\,\,\,\,{{C}_{2}}|\], where \[{{C}_{1}}\] and \[{{C}_{2}}\] denote the 1st and 2nd columns, then   \[\Delta '(x)=\left| \,\begin{matrix}  C{{'}_{1}} & {{C}_{2}}  \\ \end{matrix} \right|+\left| \,\begin{matrix} {{C}_{1}} & {{{{C}'}}_{2}}  \\ \end{matrix} \right|\]   where \[C{{'}_{i}}\] denotes the column which contains the derivative of all the functions in the \[{{i}^{th}}\]column \[{{C}_{i}}\].   In a similar fashion, if we write \[\Delta (x)=\left| \,\begin{matrix} {{R}_{1}}  \\ {{R}_{2}}  \\ \end{matrix}\, \right|\], then \[{\Delta }'\,(x)=\left| \,\begin{matrix} R{{'}_{1}}  \\ {{R}_{2}}  \\ \end{matrix}\, \right|\,+\,\left| \,\begin{matrix} {{R}_{1}}  \\ {{{{R}'}}_{2}}  \\ \end{matrix}\, \right|\,\]   (ii) Let \[\Delta (x)\] be a determinant of order three. If we write \[\Delta (x)=\left| \,\begin{matrix} {{C}_{1}} & {{C}_{2}} & {{C}_{3}}\,  \\ \end{matrix} \right|\], then     \[\Delta '(x)=\left| \,\begin{matrix} C{{'}_{1}} & {{C}_{2}} & {{C}_{3}}\,  \\ \end{matrix} \right|+\left| \,\begin{matrix} {{C}_{1}} & C{{'}_{2}} & {{C}_{3}}\,  \\ \end{matrix} \right|+\left| \,\begin{matrix} {{C}_{1}} & {{C}_{2}} & C{{'}_{3}}\,  \\ \end{matrix} \right|\]   and similarly if we consider \[\Delta (x)=\left| \,\begin{matrix} {{R}_{1}}  \\ {{R}_{2}}  \\ {{R}_{3}}  \\ \end{matrix}\, \right|\]   Then  \[\Delta '(x)=\left| \,\begin{matrix} R{{'}_{1}}  \\ {{R}_{2}}  \\ {{R}_{3}}  \\ \end{matrix}\, \right|+\left| \,\begin{matrix} {{R}_{1}}  \\ R{{'}_{2}}  \\ {{R}_{3}}  \\ \end{matrix}\, \right|+\left| \,\begin{matrix} {{R}_{1}}  \\ {{R}_{2}}  \\ R{{'}_{3}}  \\ \end{matrix}\, \right|\]   (iii) If only one row (or column) consists of functions of \[x\] and the other rows (or columns) are constant, viz.   
Let \[\Delta (x)=\left| \,\begin{matrix} {{f}_{1}}(x) & {{f}_{2}}(x) & {{f}_{3}}(x)  \\ {{b}_{1}} & {{b}_{2}} & {{b}_{3}}  \\ {{c}_{1}} & {{c}_{2}} & {{c}_{3}}  \\ \end{matrix}\, \right|\],    then \[\Delta '(x)=\left| \,\begin{matrix} f{{'}_{1}}(x) & f{{'}_{2}}(x) & f{{'}_{3}}(x)  \\ {{b}_{1}} & {{b}_{2}} & {{b}_{3}}  \\ {{c}_{1}} & {{c}_{2}} & {{c}_{3}}  \\ \end{matrix}\, \right|\]   and, in general, \[{{\Delta }^{n}}(x)=\left| \,\begin{matrix} {{f}_{1}}^{n}(x) & {{f}_{2}}^{n}(x) & {{f}_{3}}^{n}(x)  \\ {{b}_{1}} & {{b}_{2}} & {{b}_{3}}  \\ {{c}_{1}} & {{c}_{2}} & {{c}_{3}}  \\ \end{matrix}\, \right|\]   where \[n\] is any positive integer and \[{{f}^{n}}(x)\] denotes the \[{{n}^{th}}\] derivative of \[f(x)\].   (2) Integration of a determinant   Let \[\Delta (x)=\left| \,\begin{matrix} f(x) & g(x) & h(x)  \\ a & b & c  \\ l & m & n  \\ \end{matrix}\, \right|\], where \[a,\text{ }b,\text{ }c,\text{ }l,\text{ }m\] and \[n\] are constants. Then, for limits of integration \[p\] and \[q\],   \[\Rightarrow \,\int_{p}^{q}{\Delta (x)dx=\left| \,\begin{matrix} \int_{p}^{q}{f(x)dx} & \int_{p}^{q}{g(x)dx} & \int_{p}^{q}{h(x)dx}  \\ a & b & c  \\ l & m & n  \\ \end{matrix}\, \right|}\]
  • If the elements of more than one row or column are functions of \[x\], then the integration can be done only after evaluation/expansion of the determinant.
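Rule (iii) above can be verified against a central finite difference. The sketch below uses an illustrative \[\Delta (x)\] whose first row is \[(\sin x,\,{{x}^{2}},\,{{e}^{x}})\] with the other two rows constant.

```python
import math

def det3(m):
    # First-row cofactor expansion of a 3x3 determinant
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

b_row = [1, 4, 2]   # constant second row (illustrative)
c_row = [3, 0, 5]   # constant third row (illustrative)

def Delta(x):
    return det3([[math.sin(x), x*x, math.exp(x)], b_row, c_row])

def Delta_prime(x):
    # Rule (iii): differentiate only the row that depends on x
    return det3([[math.cos(x), 2*x, math.exp(x)], b_row, c_row])

x0, h = 0.7, 1e-6
numeric = (Delta(x0 + h) - Delta(x0 - h)) / (2*h)
print(abs(numeric - Delta_prime(x0)) < 1e-5)   # True
```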

 Let the two determinants of third order be,   \[{{D}_{1}}=\left| \,\begin{matrix} {{a}_{1}} & {{b}_{1}} & {{c}_{1}}  \\ {{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\ {{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\ \end{matrix} \right|\] and \[{{D}_{2}}=\left| \,\begin{matrix} {{\alpha }_{1}} & {{\beta }_{1}} & {{\gamma }_{1}}  \\ {{\alpha }_{2}} & {{\beta }_{2}} & {{\gamma }_{2}}  \\ {{\alpha }_{3}} & {{\beta }_{3}} & {{\gamma }_{3}}  \\ \end{matrix}\, \right|\].   Let D be their product, obtained here by multiplying rows by rows.   Then \[D=\left| \,\begin{matrix} {{a}_{1}} & {{b}_{1}} & {{c}_{1}}  \\ {{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\ {{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\ \end{matrix} \right|\,\,\,\times \left| \,\begin{matrix} {{\alpha }_{1}} & {{\beta }_{1}} & {{\gamma }_{1}}  \\ {{\alpha }_{2}} & {{\beta }_{2}} & {{\gamma }_{2}}  \\ {{\alpha }_{3}} & {{\beta }_{3}} & {{\gamma }_{3}}  \\ \end{matrix}\, \right|\]     \[=\left| \,\begin{matrix} {{a}_{1}}{{\alpha }_{1}}+{{b}_{1}}{{\beta }_{1}}+{{c}_{1}}{{\gamma }_{1}} & {{a}_{1}}{{\alpha }_{2}}+{{b}_{1}}{{\beta }_{2}}+{{c}_{1}}{{\gamma }_{2}} & {{a}_{1}}{{\alpha }_{3}}+{{b}_{1}}{{\beta }_{3}}+{{c}_{1}}{{\gamma }_{3}}  \\ {{a}_{2}}{{\alpha }_{1}}+{{b}_{2}}{{\beta }_{1}}+{{c}_{2}}{{\gamma }_{1}} & {{a}_{2}}{{\alpha }_{2}}+{{b}_{2}}{{\beta }_{2}}+{{c}_{2}}{{\gamma }_{2}} & {{a}_{2}}{{\alpha }_{3}}+{{b}_{2}}{{\beta }_{3}}+{{c}_{2}}{{\gamma }_{3}}  \\ {{a}_{3}}{{\alpha }_{1}}+{{b}_{3}}{{\beta }_{1}}+{{c}_{3}}{{\gamma }_{1}} & {{a}_{3}}{{\alpha }_{2}}+{{b}_{3}}{{\beta }_{2}}+{{c}_{3}}{{\gamma }_{2}} & {{a}_{3}}{{\alpha }_{3}}+{{b}_{3}}{{\beta }_{3}}+{{c}_{3}}{{\gamma }_{3}}  \\ \end{matrix}\, \right|\]     We can also multiply rows by columns, columns by rows, or columns by columns.
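The row-by-row product above can be checked on concrete numbers: the determinant of the product array equals \[{{D}_{1}}\times {{D}_{2}}\]. The two sample arrays below are illustrative.

```python
def det3(m):
    # First-row cofactor expansion of a 3x3 determinant
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

A = [[1, 2, 0], [3, 1, 4], [0, 5, 2]]
B = [[2, 0, 1], [1, 3, 0], [4, 1, 2]]

# Row-by-row multiplication: entry (i, j) = (row i of A) . (row j of B)
P = [[sum(A[i][k] * B[j][k] for k in range(3)) for j in range(3)]
     for i in range(3)]

print(det3(P), det3(A) * det3(B))   # -30 -30
```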

  (1) Minor of an element : If we take an element of the determinant and delete (remove) the row and column containing that element, the determinant that remains is called the minor of that element. It is denoted by \[{{M}_{ij}}\].   Consider the determinant \[\Delta =\left| \,\begin{matrix} {{a}_{11}} & {{a}_{12}} & {{a}_{13}}  \\ {{a}_{21}} & {{a}_{22}} & {{a}_{23}}  \\ {{a}_{31}} & {{a}_{32}} & {{a}_{33}}  \\ \end{matrix}\, \right|\],   then the determinant of minors \[M=\left| \,\begin{matrix} {{M}_{11}} & {{M}_{12}} & {{M}_{13}}  \\ {{M}_{21}} & {{M}_{22}} & {{M}_{23}}  \\ {{M}_{31}} & {{M}_{32}} & {{M}_{33}}  \\ \end{matrix}\, \right|\]   where  \[{{M}_{11}}=\] minor of \[{{a}_{11}}=\left| \,\begin{matrix} {{a}_{22}} & {{a}_{23}}  \\ {{a}_{32}} & {{a}_{33}}  \\ \end{matrix}\, \right|\]  \[{{M}_{12}}=\]minor of  \[{{a}_{12}}=\left| \,\begin{matrix} {{a}_{21}} & {{a}_{23}}  \\ {{a}_{31}} & {{a}_{33}}  \\ \end{matrix}\, \right|\] \[{{M}_{13}}=\] minor of \[{{a}_{13}}=\left| \,\begin{matrix} {{a}_{21}} & {{a}_{22}}  \\  {{a}_{31}} & {{a}_{32}}  \\ \end{matrix}\, \right|\]   Similarly, we can find the minors of the other elements. Using this concept, the value of the determinant can be written as   \[\Delta ={{a}_{11}}{{M}_{11}}-{{a}_{12}}{{M}_{12}}+{{a}_{13}}{{M}_{13}}\]   or, \[\Delta =-{{a}_{21}}{{M}_{21}}+{{a}_{22}}{{M}_{22}}-{{a}_{23}}{{M}_{23}}\]   or,  \[\Delta ={{a}_{31}}{{M}_{31}}-{{a}_{32}}{{M}_{32}}+{{a}_{33}}{{M}_{33}}\].   (2) Cofactor of an element : The cofactor of an element \[{{a}_{ij}}\] (i.e. the element in the \[{{i}^{th}}\] row and \[{{j}^{th}}\] column) is defined as \[{{(-1)}^{i+j}}\] times the minor of that element. It is denoted by \[{{C}_{ij}}\] or \[{{A}_{ij}}\] or \[{{F}_{ij}}\].
\[{{C}_{ij}}={{(-1)}^{i+j}}{{M}_{ij}}\] If \[\Delta =\left| \,\begin{matrix} {{a}_{11}} & {{a}_{12}} & {{a}_{13}}  \\  {{a}_{21}} & {{a}_{22}} & {{a}_{23}}  \\ {{a}_{31}} & {{a}_{32}} & {{a}_{33}}  \\ \end{matrix}\, \right|\], then determinant of cofactors is \[C=\left| \,\begin{matrix} {{C}_{11}} & {{C}_{12}} & {{C}_{13}}  \\ {{C}_{21}} & {{C}_{22}} & {{C}_{23}}  \\ {{C}_{31}} & {{C}_{32}} & {{C}_{33}}  \\ \end{matrix}\, \right|\]   where \[{{C}_{11}}={{(-1)}^{1+1}}{{M}_{11}}=+{{M}_{11}}\], \[{{C}_{12}}={{(-1)}^{1+2}}{{M}_{12}}=-{{M}_{12}}\]  and  \[{{C}_{13}}={{(-1)}^{1+3}}{{M}_{13}}=+{{M}_{13}}\]   Similarly, we can find the cofactors of other elements.
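The definitions above can be exercised on a concrete 3×3 determinant: build the cofactor matrix from minors and check that expanding along any of the three rows gives the same value. The sample entries are illustrative.

```python
def det2(m):
    # 2x2 determinant: ad - bc
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

A = [[1, 2, 3],
     [0, 4, 5],
     [1, 0, 6]]

def minor(i, j):
    """M_ij: delete row i and column j (0-based), take the 2x2 determinant."""
    rows = [r for k, r in enumerate(A) if k != i]
    return det2([[v for l, v in enumerate(r) if l != j] for r in rows])

# Cofactors: C_ij = (-1)^(i+j) * M_ij
C = [[(-1) ** (i + j) * minor(i, j) for j in range(3)] for i in range(3)]

# Expansion along each of the three rows gives the same value
expansions = [sum(A[i][j] * C[i][j] for j in range(3)) for i in range(3)]
print(expansions)   # [22, 22, 22]
```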

P-1 : The value of determinant remains unchanged, if the rows and the columns are interchanged.   Since the determinant remains unchanged when rows and columns are interchanged, it is obvious that any theorem which is true for ‘rows’ must also be true for ‘columns’.   P-2 : If any two rows (or columns) of a determinant be interchanged, the determinant is unaltered in numerical value but is changed in sign only.   P-3 : If a determinant has two rows (or columns) identical, then its value is zero.   P-4 : If all the elements of any row (or column) be multiplied by the same number, then the value of determinant is multiplied by that number.   P-5 : If each element of any row (or column) can be expressed as a sum of two terms, then the determinant can be expressed as the sum of the determinants.   e.g., \[\left| \,\begin{matrix} {{a}_{1}}+x & {{b}_{1}}+y & {{c}_{1}}+z  \\ {{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\{{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|=\left| \,\begin{matrix} {{a}_{1}} & {{b}_{1}} & {{c}_{1}}  \\{{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\{{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|+\left| \,\begin{matrix} x & y & z  \\{{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\{{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|\]   P-6 : The value of a determinant is not altered by adding to the elements of any row (or column) the same multiples of the corresponding elements of any other row (or column)   e.g.,  \[D=\left| \,\begin{matrix}{{a}_{1}} & {{b}_{1}} & {{c}_{1}}  \\{{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\{{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|\]     and  \[D'=\left| \,\begin{matrix} {{a}_{1}}+m{{a}_{2}} & {{b}_{1}}+m{{b}_{2}} & {{c}_{1}}+m{{c}_{2}}  \\ {{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\ {{a}_{3}}-n{{a}_{1}} & {{b}_{3}}-n{{b}_{1}} & {{c}_{3}}-n{{c}_{1}}  \\\end{matrix}\, \right|\].     Then \[D'=D\].   
P-7 : If all the elements below the leading diagonal, or all the elements above it, or all the elements except those on the leading diagonal, are zero, then the value of the determinant equals the product of the leading diagonal elements.   P-8 : If a determinant D becomes zero on putting \[x=\alpha \], then we say that \[(x-\alpha )\] is a factor of the determinant.   P-9 : While applying operations simultaneously on a determinant, at least one row (or column) must remain unchanged; i.e., the maximum number of simultaneous operations = order of the determinant – 1.   P-10 : If in an operation a row (or column) is multiplied by a non-zero number, then the resulting determinant must be divided by that number to recover the original value.
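Properties P-2, P-4 and P-6 can be demonstrated numerically on one 3×3 example (the entries below are illustrative).

```python
def det3(m):
    # First-row cofactor expansion of a 3x3 determinant
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
base = det3(A)

# P-2: interchanging two rows changes only the sign
swapped = [A[1], A[0], A[2]]
print(det3(swapped) == -base)          # True

# P-4: multiplying one row by k multiplies the determinant by k
scaled = [[5*v for v in A[0]], A[1], A[2]]
print(det3(scaled) == 5 * base)        # True

# P-6: R1 -> R1 + m*R2 leaves the value unchanged
m = 7
shifted = [[A[0][j] + m*A[1][j] for j in range(3)], A[1], A[2]]
print(det3(shifted) == base)           # True
```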

Let us consider three homogeneous linear equations   \[{{a}_{1}}x+{{b}_{1}}y+{{c}_{1}}z=0\],\[{{a}_{2}}x+{{b}_{2}}y+{{c}_{2}}z=0\]   and  \[{{a}_{3}}x+{{b}_{3}}y+{{c}_{3}}z=0\]   Eliminating \[x,\,\,y,\,\,z\] from the above three equations, we obtain   \[{{a}_{1}}({{b}_{2}}{{c}_{3}}-{{b}_{3}}{{c}_{2}})-{{b}_{1}}({{a}_{2}}{{c}_{3}}-{{a}_{3}}{{c}_{2}})+{{c}_{1}}({{a}_{2}}{{b}_{3}}-{{a}_{3}}{{b}_{2}})=0\]   …..(i)   The L.H.S. of (i) is represented by  \[\left| \,\begin{matrix}{{a}_{1}} & {{b}_{1}} & {{c}_{1}}  \\{{a}_{2}} & {{b}_{2}} & {{c}_{2}}  \\{{a}_{3}} & {{b}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|={{a}_{1}}\,\left| \,\begin{matrix}{{b}_{2}} & {{c}_{2}}  \\{{b}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|-{{b}_{1}}\,\left| \,\begin{matrix}{{a}_{2}} & {{c}_{2}}  \\{{a}_{3}} & {{c}_{3}}  \\\end{matrix}\, \right|+{{c}_{1}}\,\left| \,\begin{matrix}{{a}_{2}} & {{b}_{2}}  \\{{a}_{3}} & {{b}_{3}}  \\\end{matrix}\, \right|\]   Since it contains three rows and three columns, it is called a determinant of third order.   The number of elements in a second order determinant is \[{{2}^{2}}=4\] and the number of elements in a third order determinant is \[{{3}^{2}}=9\].   Rows and columns of a determinant : In a determinant, the horizontal lines, counted from the top as \[{{1}^{st}},\text{ }{{2}^{nd}},\text{ }{{3}^{rd}},\ldots ..\], are known as rows and denoted by \[{{R}_{1}},\,\,{{R}_{2}},\,\,{{R}_{3}},\,\,......\], and the vertical lines, counted from left to right as \[{{1}^{st}},\text{ }{{2}^{nd}},\text{ }{{3}^{rd}},\ldots ..\], are known as columns and denoted by \[{{C}_{1}},\,\,{{C}_{2}},\,\,{{C}_{3}},.....\]

  If a function \[f(x)\] is such that,     (i) It is continuous in the closed interval \[[a,b]\]     (ii) It is derivable in the open interval \[(a,\,b)\]     Then there exists at least one value \['c'\] of \[x\] in the open interval \[(a,b)\] such that \[\frac{f(b)-f(a)}{b-a}=f'(c)\].  
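A worked instance of the mean value theorem stated above: for the illustrative choice \[f(x)={{x}^{3}}\] on \[[0,\,2]\], the mean slope is \[\frac{f(2)-f(0)}{2-0}=4\], and \[f'(c)=3{{c}^{2}}=4\] gives \[c=\frac{2}{\sqrt{3}}\], which indeed lies in \[(0,\,2)\].

```python
import math

f = lambda x: x**3
fp = lambda x: 3*x**2          # derivative of f
a, b = 0.0, 2.0

mean_slope = (f(b) - f(a)) / (b - a)
c = 2 / math.sqrt(3)           # solves f'(c) = mean_slope

print(mean_slope)                          # 4.0
print(abs(fp(c) - mean_slope) < 1e-12)     # True
print(a < c < b)                           # True
```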

 If \[f(x)\]is such that,   (i) It is continuous in the closed interval  \[[a,\,\,b]\]   (ii) It is derivable in the open interval \[(a,\,b)\]   (iii) \[f(a)=f(b)\]   Then there exists at least one value \['c'\] of \[x\] in the open interval \[(a,\,\,b)\] such that \[f'(c)=0\].  
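A worked instance of Rolle's theorem as stated above: for the illustrative choice \[f(x)={{x}^{2}}-4x+3\] on \[[1,\,3]\], \[f(1)=f(3)=0\] and \[f'(x)=2x-4\] vanishes at \[c=2\], which lies in \[(1,\,3)\].

```python
f = lambda x: x*x - 4*x + 3
fp = lambda x: 2*x - 4     # derivative of f

a, b, c = 1, 3, 2
print(f(a), f(b))   # 0 0  (equal endpoint values)
print(fp(c))        # 0    (derivative vanishes inside (a, b))
```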

By maximum (or minimum) or local maximum (or local minimum) value of a function \[f(x)\] at a point \[c\in [a,b]\] we mean the greatest (or the least) value in the immediate neighbourhood of \[x=c\]. It does not mean the greatest or absolute maximum (or the least or absolute minimum) of \[f(x)\] in the interval \[[a,\,b]\].     A function may have a number of local maxima or local minima in a given interval, and even a local minimum may be greater than a local maximum.     Thus a local maximum value may not be the greatest (absolute maximum) and a local minimum value may not be the least (absolute minimum) value of the function in any given interval.     However, if a function \[f(x)\] is continuous on a closed interval \[[a,\,b]\], then it attains the absolute maximum (absolute minimum) at critical points, or at the end points of the interval \[[a,\,b]\]. Thus, to find the absolute maximum (absolute minimum) value of the function, we choose the largest and smallest amongst the numbers \[f(a),f({{c}_{1}}),f({{c}_{2}}),....,f({{c}_{n}}),f(b)\], where \[x={{c}_{1}},{{c}_{2}},....,{{c}_{n}}\] are the critical points.
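The recipe above, applied to the illustrative function \[f(x)={{x}^{3}}-3x\] on \[[-2,\,3]\]: the critical points where \[f'(x)=3{{x}^{2}}-3=0\] are \[x=\pm 1\], so we compare \[f\] at those points and at the endpoints.

```python
f = lambda x: x**3 - 3*x

candidates = [-2, -1, 1, 3]        # endpoints + critical points
values = {x: f(x) for x in candidates}

abs_max = max(values, key=values.get)
abs_min = min(values, key=values.get)
print(abs_max, values[abs_max])    # 3 18
print(abs_min, values[abs_min])    # -2 -2  (also attained at x = 1)
```

Note that the value 2 at the local maximum \[x=-1\] is neither the absolute maximum nor the absolute minimum, matching the discussion above.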

