Face Recognition with OpenCV, NDK and Android



























I am trying to run this code to take a photo and recognize the photo's owner. First the user clicks the Train button, takes a picture and enters their name; after that they click the Recognize button and take a picture. If that person was saved before, the app recognizes their face, otherwise it shows "unknown person", for example. The app normally runs fine, but it sometimes crashes with this error:



FATAL EXCEPTION: main
Process: com.facedetection.app, PID: 7442
java.lang.RuntimeException: Unable to stop activity {com.facedetection.app/com.facedetection.app.TrainActivity}: CvException [org.opencv.core.CvException: cv::Exception: OpenCV(4.0.0-pre) E:\AssemCourses\opencv-master\modules\core\src\matrix.cpp:235: error: (-215:Assertion failed) s >= 0 in function 'setSize'
]
at android.app.ActivityThread.performDestroyActivity(ActivityThread.java:4852)
at android.app.ActivityThread.handleDestroyActivity(ActivityThread.java:4915)
at android.app.ActivityThread.access$1600(ActivityThread.java:211)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1759)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:145)
at android.app.ActivityThread.main(ActivityThread.java:6946)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1404)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1199)
Caused by: CvException [org.opencv.core.CvException: cv::Exception: OpenCV(4.0.0-pre) E:\AssemCourses\opencv-master\modules\core\src\matrix.cpp:235: error: (-215:Assertion failed) s >= 0 in function 'setSize'
]
at org.opencv.face.FaceRecognizer.train_0(Native Method)
at org.opencv.face.FaceRecognizer.train(FaceRecognizer.java:133)
at com.facedetection.app.TrainActivity.trainfaces(TrainActivity.java:95)
at com.facedetection.app.TrainActivity.onStop(TrainActivity.java:351)
at android.app.Instrumentation.callActivityOnStop(Instrumentation.java:1305)
at android.app.Activity.performStop(Activity.java:6777)
at android.app.ActivityThread.performDestroyActivity(ActivityThread.java:4847)
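
From what I understand, the assertion s >= 0 in 'setSize' means that a negative size is being passed to a Mat allocation somewhere inside the native train() call. As far as I can tell, the same assertion can be reproduced just by constructing a Mat with a negative dimension (a hypothetical snippet only meant to illustrate the message, not code from my app):

    // Hypothetical: a negative row count trips the same assertion as in the crash above
    Mat bad = new Mat(-1, 1, CvType.CV_32SC1);
    // -> CvException: (-215:Assertion failed) s >= 0 in function 'setSize'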


Here is the class mentioned in the error logcat:



Class TrainActivity.java:



    import android.Manifest;
import android.annotation.SuppressLint;
import android.content.DialogInterface;
import android.content.pm.ActivityInfo;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.support.v7.app.AlertDialog;
import android.support.v7.app.AppCompatActivity;
import android.text.InputType;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.EditText;
import android.widget.ListView;
import android.widget.TextView;
import android.widget.Toast;

import com.facebook.stetho.Stetho;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;

import org.opencv.core.Rect;
import org.opencv.core.Size;
import org.opencv.face.Face;
import org.opencv.face.FaceRecognizer;
import org.opencv.face.LBPHFaceRecognizer;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

import java.io.File;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Set;
import static org.opencv.objdetect.Objdetect.CASCADE_SCALE_IMAGE;

/**
* Created by Assem Abozaid on 6/2/2018.
*/

public class TrainActivity extends AppCompatActivity implements CameraBridgeViewBase.CvCameraViewListener2 {
private static String TAG = TrainActivity.class.getSimpleName();
private CameraBridgeViewBase openCVCamera;
private Mat rgba,gray;
private CascadeClassifier classifier;
private MatOfRect faces;
private static final int PERMS_REQUEST_CODE = 123;
private ArrayList<Mat> images;
private ArrayList<String> imagesLabels;
private Storage local;
private String[] uniqueLabels;
FaceRecognizer recognize;
private boolean trainfaces() {
if(images.isEmpty())
return false;
List<Mat> imagesMatrix = new ArrayList<>();
for (int i = 0; i < images.size(); i++)
imagesMatrix.add(images.get(i));
Set<String> uniqueLabelsSet = new HashSet<>(imagesLabels); // Get all unique labels
uniqueLabels = uniqueLabelsSet.toArray(new String[uniqueLabelsSet.size()]); // Convert to String array, so we can read the values from the indices

int[] classesNumbers = new int[uniqueLabels.length];
for (int i = 0; i < classesNumbers.length; i++)
classesNumbers[i] = i + 1; // Create incrementing list for each unique label starting at 1
int[] classes = new int[imagesLabels.size()];
for (int i = 0; i < imagesLabels.size(); i++) {
String label = imagesLabels.get(i);
for (int j = 0; j < uniqueLabels.length; j++) {
if (label.equals(uniqueLabels[j])) {
classes[i] = classesNumbers[j]; // Insert corresponding number
break;
}
}
}
Mat vectorClasses = new Mat(classes.length, 1, CvType.CV_32SC1); // CV_32S == int
vectorClasses.put(0, 0, classes); // Copy int array into a vector

recognize = LBPHFaceRecognizer.create(3,8,8,8,200);
recognize.train(imagesMatrix, vectorClasses);
if(SaveImage())
return true;

return false;
}
public void showLabelsDialog() {
Set<String> uniqueLabelsSet = new HashSet<>(imagesLabels); // Get all unique labels
if (!uniqueLabelsSet.isEmpty()) {
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setTitle("Select label:");
builder.setNegativeButton("Cancel", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
dialog.dismiss();
images.remove(images.size()-1);
}
});
builder.setCancelable(false); // Prevent the user from closing the dialog

String[] uniqueLabels = uniqueLabelsSet.toArray(new String[uniqueLabelsSet.size()]); // Convert to String array for ArrayAdapter
Arrays.sort(uniqueLabels); // Sort labels alphabetically
final ArrayAdapter<String> arrayAdapter = new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1, uniqueLabels) {
@Override
public @NonNull
View getView(int position, @Nullable View convertView, @NonNull ViewGroup parent) {
TextView textView = (TextView) super.getView(position, convertView, parent);
if (getResources().getBoolean(R.bool.isTablet))
textView.setTextSize(20); // Make text slightly bigger on tablets compared to phones
else
textView.setTextSize(18); // Increase text size a little bit
return textView;
}
};
ListView mListView = new ListView(this);
mListView.setAdapter(arrayAdapter); // Set adapter, so the items actually show up
builder.setView(mListView); // Set the ListView

final AlertDialog dialog = builder.show(); // Show dialog and store in final variable, so it can be dismissed by the ListView

mListView.setOnItemClickListener(new AdapterView.OnItemClickListener() {
@Override
public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
dialog.dismiss();
addLabel(arrayAdapter.getItem(position));
Log.i(TAG, "Labels Size "+imagesLabels.size()+"");
}
});
} else {
showEnterLabelDialog();
}

}
private void showEnterLabelDialog() {
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setTitle("Please enter your name:");

final EditText input = new EditText(this);
input.setInputType(InputType.TYPE_CLASS_TEXT);
builder.setView(input);

builder.setPositiveButton("Submit", null); // Set up positive button, but do not provide a listener, so we can check the string before dismissing the dialog
builder.setNegativeButton("Cancel", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
dialog.dismiss();
images.remove(images.size()-1);
}
});
builder.setCancelable(false); // User has to input a name
AlertDialog dialog = builder.create();

// Source: http://stackoverflow.com/a/7636468/2175837
dialog.setOnShowListener(new DialogInterface.OnShowListener() {
@Override
public void onShow(final DialogInterface dialog) {
Button mButton = ((AlertDialog) dialog).getButton(AlertDialog.BUTTON_POSITIVE);
mButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
String string = input.getText().toString().trim();
if (!string.isEmpty()) { // Make sure the input is valid
// If input is valid, dismiss the dialog and add the label to the array
dialog.dismiss();
addLabel(string);
}
}
});
}
});
// Show keyboard, so the user can start typing straight away
dialog.getWindow().setSoftInputMode(WindowManager.LayoutParams.SOFT_INPUT_STATE_VISIBLE);

dialog.show();
}
private void addLabel(String string) {
String label = string.substring(0, 1).toUpperCase(Locale.US) + string.substring(1).trim().toLowerCase(Locale.US); // Make sure that the name is always uppercase and rest is lowercase
imagesLabels.add(label); // Add label to list of labels
Log.i(TAG, "Label: " + label);

}
public boolean SaveImage() {
File path = new File(Environment.getExternalStorageDirectory(), "TrainedData");
path.mkdirs();
String filename = "lbph_trained_data.xml";
File file = new File(path, filename);
recognize.save(file.toString());
if(file.exists())
return true;
return false;
}
public void cropedImages(Mat mat) {
Rect rect_Crop=null;
for(Rect face: faces.toArray()) {
rect_Crop = new Rect(face.x, face.y, face.width, face.height);
}
Mat croped = new Mat(mat, rect_Crop);
images.add(croped);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.train_main);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
Stetho.initializeWithDefaults(this);

if (hasPermissions()){
Toast.makeText(this, "Permission Granted", Toast.LENGTH_SHORT).show();
Log.i(TAG, "Permission Granted Before");

}
else {
requestPerms();
}

openCVCamera = (CameraBridgeViewBase)findViewById(R.id.java_camera_view);
openCVCamera.setCameraIndex(CameraBridgeViewBase.CAMERA_ID_FRONT);
openCVCamera.setVisibility(SurfaceView.VISIBLE);
openCVCamera.setCvCameraViewListener(this);
local = new Storage(this);
Button detect = (Button)findViewById(R.id.take_picture_button);
detect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if(gray.total() == 0)
Toast.makeText(getApplicationContext(), "Can't Detect Faces", Toast.LENGTH_SHORT).show();
classifier.detectMultiScale(gray,faces,1.1,3,0|CASCADE_SCALE_IMAGE, new Size(30,30));
if(!faces.empty()) {
if(faces.toArray().length > 1)
Toast.makeText(getApplicationContext(), "Mutliple Faces Are not allowed", Toast.LENGTH_SHORT).show();
else {
if(gray.total() == 0) {
Log.i(TAG, "Empty gray image");
return;
}
cropedImages(gray);
showLabelsDialog();
Toast.makeText(getApplicationContext(), "Face Detected", Toast.LENGTH_SHORT).show();
}
}else
Toast.makeText(getApplicationContext(), "Unknown Face", Toast.LENGTH_SHORT).show();
}
});
}
@SuppressLint("WrongConstant")
private boolean hasPermissions(){
int res = 0;
//string array of permissions,
String[] permissions = new String[]{Manifest.permission.CAMERA};

for (String perms : permissions){
res = checkCallingOrSelfPermission(perms);
if (!(res == PackageManager.PERMISSION_GRANTED)){
return false;
}
}
return true;
}
private void requestPerms(){
String[] permissions = new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_EXTERNAL_STORAGE};
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M){
requestPermissions(permissions,PERMS_REQUEST_CODE);

}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
boolean allowed = true;
switch (requestCode){
case PERMS_REQUEST_CODE:
for (int res : grantResults){
// if user granted all permissions.
allowed = allowed && (res == PackageManager.PERMISSION_GRANTED);
}
break;
default:
// if user not granted permissions.
allowed = false;
break;
}
if (allowed){
//user granted all permissions we can perform our task.
Log.i(TAG, "Permission has been added");
}
else {
// we will give warning to user that they haven't granted permissions.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) || shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
shouldShowRequestPermissionRationale(Manifest.permission.READ_EXTERNAL_STORAGE)){
Toast.makeText(this, "Permission Denied.", Toast.LENGTH_SHORT).show();
}
}
}
}
private BaseLoaderCallback callbackLoader = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch(status) {
case BaseLoaderCallback.SUCCESS:
faces = new MatOfRect();
openCVCamera.enableView();

images = local.getListMat("images");
imagesLabels = local.getListString("imagesLabels");

break;
default:
super.onManagerConnected(status);
break;
}
}
};

@Override
protected void onPause() {
super.onPause();
if(openCVCamera != null)
openCVCamera.disableView();


}
@Override
protected void onStop() {
super.onStop();
if (images != null && imagesLabels != null) {
local.putListMat("images", images);
local.putListString("imagesLabels", imagesLabels);
Log.i(TAG, "Images have been saved");
if(trainfaces()) {
images.clear();
imagesLabels.clear();
}
}
}
@Override
protected void onDestroy(){
super.onDestroy();
if(openCVCamera != null)
openCVCamera.disableView();
}
@Override
protected void onResume(){
super.onResume();
if(OpenCVLoader.initDebug()) {
Log.i(TAG, "System Library Loaded Successfully");
callbackLoader.onManagerConnected(BaseLoaderCallback.SUCCESS);
} else {
Log.i(TAG, "Unable To Load System Library");
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION, this, callbackLoader);
}
}

@Override
public void onCameraViewStarted(int width, int height) {
rgba = new Mat();
gray = new Mat();
classifier = FileUtils.loadXMLS(this, "lbpcascade_frontalface_improved.xml");
}

@Override
public void onCameraViewStopped() {
rgba.release();
gray.release();
}

@Override
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
Mat mGrayTmp = inputFrame.gray();
Mat mRgbaTmp = inputFrame.rgba();

int orientation = openCVCamera.getScreenOrientation();
if (openCVCamera.isEmulator()) // Treat emulators as a special case
Core.flip(mRgbaTmp, mRgbaTmp, 1); // Flip along y-axis
else {
switch (orientation) { // RGB image
case ActivityInfo.SCREEN_ORIENTATION_PORTRAIT:
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_PORTRAIT:
Core.flip(mRgbaTmp, mRgbaTmp, 0); // Flip along x-axis
break;
case ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE:
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE:
Core.flip(mRgbaTmp, mRgbaTmp, 1); // Flip along y-axis
break;
}
switch (orientation) { // Grayscale image
case ActivityInfo.SCREEN_ORIENTATION_PORTRAIT:
Core.transpose(mGrayTmp, mGrayTmp); // Rotate image
Core.flip(mGrayTmp, mGrayTmp, -1); // Flip along both axis
break;
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_PORTRAIT:
Core.transpose(mGrayTmp, mGrayTmp); // Rotate image
break;
case ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE:
Core.flip(mGrayTmp, mGrayTmp, 1); // Flip along y-axis
break;
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE:
Core.flip(mGrayTmp, mGrayTmp, 0); // Flip along x-axis
break;
}
}
gray = mGrayTmp;
rgba = mRgbaTmp;
Imgproc.resize(gray, gray, new Size(200,200.0f/ ((float)gray.width()/ (float)gray.height())));
return rgba;
}
}


The error lines mentioned in the logcat refer to these Java code lines:



In TrainActivity.java:



    recognize = LBPHFaceRecognizer.create(3,8,8,8,200);
recognize.train(imagesMatrix, vectorClasses);


And



     if(trainfaces()) {
images.clear();
imagesLabels.clear();
}
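
To narrow this down, what I am planning to try is to log the sizes of the training inputs right before the train() call, so that a zero or negative dimension shows up in logcat before the native code asserts. A rough debugging sketch (assuming it is placed in trainfaces() just before recognize.train(imagesMatrix, vectorClasses), using only the variables from that method):

    // Debugging sketch: dump the dimensions of every training sample and the label vector
    Log.i(TAG, "Training samples: " + imagesMatrix.size() + ", label rows: " + vectorClasses.rows());
    for (int i = 0; i < imagesMatrix.size(); i++) {
        Mat sample = imagesMatrix.get(i);
        Log.i(TAG, "Sample " + i + ": " + sample.rows() + "x" + sample.cols()
                + " type=" + CvType.typeToString(sample.type()) + " empty=" + sample.empty());
        if (sample.empty()) {
            Log.w(TAG, "Sample " + i + " is empty, skipping training");
            return false; // avoid handing an invalid Mat to the native train() call
        }
    }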


Do you guys have any idea about this problem and how to fix it?



PS: I am a beginner with OpenCV.




























  • The error message says that the assertion s >= 0 in function 'setSize' failed, so s must be less than zero. I would take a look at the sizes of imageMatrix and vectorClasses and any pertinent sizes of the objects contained within them.

    – Jonny Henly
    Nov 16 '18 at 18:16













  • imagesMatrix is an ArrayList of the training images, and its size is the number of pictures the user takes. classes is an array of the image labels, and vectorClasses is the conversion of classes into a vector. I don't know where the error is; please check the trainfaces() method.

    – Mr.Geek
    Nov 16 '18 at 21:20


















getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
Stetho.initializeWithDefaults(this);

if (hasPermissions()){
Toast.makeText(this, "Permission Granted", Toast.LENGTH_SHORT).show();
Log.i(TAG, "Permission Granted Before");

}
else {
requestPerms();
}

openCVCamera = (CameraBridgeViewBase)findViewById(R.id.java_camera_view);
openCVCamera.setCameraIndex(CameraBridgeViewBase.CAMERA_ID_FRONT);
openCVCamera.setVisibility(SurfaceView.VISIBLE);
openCVCamera.setCvCameraViewListener(this);
local = new Storage(this);
Button detect = (Button)findViewById(R.id.take_picture_button);
detect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if(gray.total() == 0) {
Toast.makeText(getApplicationContext(), "Can't Detect Faces", Toast.LENGTH_SHORT).show();
return;
}
classifier.detectMultiScale(gray,faces,1.1,3,0|CASCADE_SCALE_IMAGE, new Size(30,30));
if(!faces.empty()) {
if(faces.toArray().length > 1)
Toast.makeText(getApplicationContext(), "Mutliple Faces Are not allowed", Toast.LENGTH_SHORT).show();
else {
if(gray.total() == 0) {
Log.i(TAG, "Empty gray image");
return;
}
cropedImages(gray);
showLabelsDialog();
Toast.makeText(getApplicationContext(), "Face Detected", Toast.LENGTH_SHORT).show();
}
}else
Toast.makeText(getApplicationContext(), "Unknown Face", Toast.LENGTH_SHORT).show();
}
});
}
@SuppressLint("WrongConstant")
private boolean hasPermissions(){
int res = 0;
//string array of permissions,
String[] permissions = new String[]{Manifest.permission.CAMERA};

for (String perms : permissions){
res = checkCallingOrSelfPermission(perms);
if (!(res == PackageManager.PERMISSION_GRANTED)){
return false;
}
}
return true;
}
private void requestPerms(){
String[] permissions = new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_EXTERNAL_STORAGE};
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M){
requestPermissions(permissions,PERMS_REQUEST_CODE);

}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
boolean allowed = true;
switch (requestCode){
case PERMS_REQUEST_CODE:
for (int res : grantResults){
// if user granted all permissions.
allowed = allowed && (res == PackageManager.PERMISSION_GRANTED);
}
break;
default:
// if user not granted permissions.
allowed = false;
break;
}
if (allowed){
//user granted all permissions we can perform our task.
Log.i(TAG, "Permission has been added");
}
else {
// we will give warning to user that they haven't granted permissions.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) || shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
shouldShowRequestPermissionRationale(Manifest.permission.READ_EXTERNAL_STORAGE)){
Toast.makeText(this, "Permission Denied.", Toast.LENGTH_SHORT).show();
}
}
}
}
private BaseLoaderCallback callbackLoader = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch(status) {
case BaseLoaderCallback.SUCCESS:
faces = new MatOfRect();
openCVCamera.enableView();

images = local.getListMat("images");
imagesLabels = local.getListString("imagesLabels");

break;
default:
super.onManagerConnected(status);
break;
}
}
};

@Override
protected void onPause() {
super.onPause();
if(openCVCamera != null)
openCVCamera.disableView();


}
@Override
protected void onStop() {
super.onStop();
if (images != null && imagesLabels != null) {
local.putListMat("images", images);
local.putListString("imagesLabels", imagesLabels);
Log.i(TAG, "Images have been saved");
if(trainfaces()) {
images.clear();
imagesLabels.clear();
}
}
}
@Override
protected void onDestroy(){
super.onDestroy();
if(openCVCamera != null)
openCVCamera.disableView();
}
@Override
protected void onResume(){
super.onResume();
if(OpenCVLoader.initDebug()) {
Log.i(TAG, "System Library Loaded Successfully");
callbackLoader.onManagerConnected(BaseLoaderCallback.SUCCESS);
} else {
Log.i(TAG, "Unable To Load System Library");
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION, this, callbackLoader);
}
}

@Override
public void onCameraViewStarted(int width, int height) {
rgba = new Mat();
gray = new Mat();
classifier = FileUtils.loadXMLS(this, "lbpcascade_frontalface_improved.xml");
}

@Override
public void onCameraViewStopped() {
rgba.release();
gray.release();
}

@Override
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
Mat mGrayTmp = inputFrame.gray();
Mat mRgbaTmp = inputFrame.rgba();

int orientation = openCVCamera.getScreenOrientation();
if (openCVCamera.isEmulator()) // Treat emulators as a special case
Core.flip(mRgbaTmp, mRgbaTmp, 1); // Flip along y-axis
else {
switch (orientation) { // RGB image
case ActivityInfo.SCREEN_ORIENTATION_PORTRAIT:
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_PORTRAIT:
Core.flip(mRgbaTmp, mRgbaTmp, 0); // Flip along x-axis
break;
case ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE:
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE:
Core.flip(mRgbaTmp, mRgbaTmp, 1); // Flip along y-axis
break;
}
switch (orientation) { // Grayscale image
case ActivityInfo.SCREEN_ORIENTATION_PORTRAIT:
Core.transpose(mGrayTmp, mGrayTmp); // Rotate image
Core.flip(mGrayTmp, mGrayTmp, -1); // Flip along both axis
break;
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_PORTRAIT:
Core.transpose(mGrayTmp, mGrayTmp); // Rotate image
break;
case ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE:
Core.flip(mGrayTmp, mGrayTmp, 1); // Flip along y-axis
break;
case ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE:
Core.flip(mGrayTmp, mGrayTmp, 0); // Flip along x-axis
break;
}
}
gray = mGrayTmp;
rgba = mRgbaTmp;
Imgproc.resize(gray, gray, new Size(200,200.0f/ ((float)gray.width()/ (float)gray.height())));
return rgba;
}
}


The error lines mentioned in the logcat refer to these Java code lines:



In TrainActivity.java:



    recognize = LBPHFaceRecognizer.create(3,8,8,8,200);
    recognize.train(imagesMatrix, vectorClasses);


And



    if(trainfaces()) {
        images.clear();
        imagesLabels.clear();
    }


Do you have any idea what is causing this problem and how to fix it?



PS: I am a beginner with OpenCV.







java android opencv






asked Nov 16 '18 at 18:13 by Mr.Geek

    The error message says that the assertion s >= 0 in function 'setSize' failed, so s must be less than zero. I would take a look at the sizes of imageMatrix and vectorClasses and any pertinent sizes of the objects contained within them.

    – Jonny Henly
    Nov 16 '18 at 18:16













    imagesMatrix is a list of the training images, and its size is the number of pictures the user takes. classes is built from the image labels, and vectorClasses is the conversion of classes into a vector Mat. I don't know where the error is; please check the trainfaces() method.

    – Mr.Geek
    Nov 16 '18 at 21:20
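
One way to act on the comment above is to validate the training inputs inside trainfaces() before the native call, so an empty or mismatched Mat fails with a readable Java-side message instead of the setSize() assertion. This is only a sketch; it reuses the imagesMatrix and vectorClasses variables from the question's trainfaces() and would sit right before the existing recognize.train() call:

    // Sketch: check every training image and the label vector before training.
    for (int i = 0; i < imagesMatrix.size(); i++) {
        Mat m = imagesMatrix.get(i);
        Log.i(TAG, "Training image " + i + ": " + m.width() + "x" + m.height() + " type=" + m.type());
        if (m.empty()) {
            Log.e(TAG, "Training image " + i + " is empty, aborting training");
            return false;
        }
    }
    if (imagesMatrix.size() != vectorClasses.rows()) {
        Log.e(TAG, "Label rows (" + vectorClasses.rows() + ") do not match image count (" + imagesMatrix.size() + ")");
        return false;
    }
    // ...then proceed with recognize.train(imagesMatrix, vectorClasses) as before.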















1 Answer
You can solve it by resetting the preferences in onCreate() of TrainActivity:



// -------------------------------
// reset preferences
local = new Storage(this);
images = new ArrayList<>();
imagesLabels = new ArrayList<>();
local.putListMat("images", images);
local.putListString("imagesLabels", imagesLabels);
// -------------------------------
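
For placement, that reset goes at the top of onCreate() in TrainActivity, before the loader callback reads the saved lists back. A sketch only, reusing the Storage helper and field names from the question's code:

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.train_main);

        // Reset persisted training data so stale or corrupted Mats are never
        // handed back to FaceRecognizer.train() in onStop().
        local = new Storage(this);
        images = new ArrayList<>();
        imagesLabels = new ArrayList<>();
        local.putListMat("images", images);
        local.putListString("imagesLabels", imagesLabels);

        // ... the rest of the original onCreate() (camera view setup, button listener) ...
    }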





answered Dec 27 '18 at 11:46 by ahmednabil88